All posts by Tim Obezuk

Introducing Cloudflare Browser Isolation beta

Post Syndicated from Tim Obezuk original https://blog.cloudflare.com/browser-beta/

Web browsers are the culprit behind 70% of endpoint compromises. The same application that connects users to the entire Internet also connects you to all of the potentially harmful parts of the Internet. It’s an open door to nearly every connected system on the planet, which is powerful and terrifying.

We also rely on browsers more than ever. Most of the applications we use live in a browser, and that share will only continue to grow. For more and more organizations, a corporate laptop is little more than a managed web browser.

To keep those devices, and the data they hold or access, safe, enterprises have started to deploy “browser isolation” services where the browser itself doesn’t run on the machine. Instead, the browser runs on a virtual machine in a cloud provider somewhere. Because the browsing happens away from the device, any threats the browser encounters stay on that virtual machine in the cloud.

However, most isolation solutions take one of two approaches that both ruin the convenience and flexibility of a web browser:

  • Record the isolated browser and send a live stream of it to the user, which is slow and makes it difficult to do basic things like input text to a form.
  • Unpack the webpage, inspect it, repack it and send it to the user – sometimes missing threats or more often failing to repack the webpage in a way that it still works.

Today, we’re excited to open up a beta of a third approach to keeping web browsing safe with Cloudflare Browser Isolation. Browser sessions run in sandboxed environments in Cloudflare data centers in 200 cities around the world, bringing the remote browser milliseconds away from the user so it feels like local web browsing.

Instead of streaming pixels to the user, Cloudflare Browser Isolation sends the final output of a browser’s web page rendering. The approach means that the only thing ever sent to the device is a package of draw commands to render the webpage, which also makes Cloudflare Browser Isolation compatible with any HTML5 compliant browser.

The result is a browser that just feels like a browser, while keeping threats far away from the device.

We’re inviting users to sign up for the beta today as part of Zero Trust week at Cloudflare. If you’re interested in signing up now, visit the bottom of this post. If you’d like to find out how this works, keep reading.

The unexpected universal productivity application

While it never quite became the replacement operating system Marc Andreessen predicted in 1995, the web browser is perhaps the most important application on end-user devices today. In the workplace, many people spend the majority of their computer time in a web browser connected to internal apps and external SaaS applications and services. As this has happened, browsers have had to become increasingly complex to keep up with the expanding richness of the web and the demands of modern web applications such as Office 365 and Google Workspace.

However, despite the pivotal and ubiquitous role of web browsers, they are the least controlled application in the enterprise. Businesses struggle to control how users interact with web browsers. It’s all too easy for a user to inadvertently download an infected file, install a malicious extension, upload sensitive company data or click a malicious zero-day link in an email or on a webpage.

Making the problem worse is the growing prevalence of BYOD, which makes it difficult to enforce which browsers are used or whether they are properly patched. Mobile device management (MDM) is a step in the right direction, but just like the slow patching cycles of on-premises firewalls, MDM can often be too slow to protect against zero-day threats. I’ve been the recipient of many mass emails from CISOs reminding everyone to patch their browser, and to do it right now because this time it’s “really important” (CVE-2019-5786).

Reimagining the browser

Earlier this week we announced Cloudflare One, which is our vision for the future of the corporate network. The fundamental approach we’ve taken is a blank sheet: to zero out all the assumptions of the old model (like castle-and-moat) and usher in a new model based on the complex nature of today’s corporate networking and the shift to Zero Trust, cloud-based networking-as-a-service.

It would be impossible to do this without thinking about the browser. Remote computing technologies have offered the promise of fixing the problems of the browser for some time — a future where anyone can benefit from the security and scale of cloud computing on their personal device. The reality has been that getting a generally performant solution is much more difficult than it sounds. It requires sending a user’s input over the Internet, computing that input, retrieving resources off the web, and then streaming them back to the user. And it all must occur in milliseconds, to create an illusion of using a local piece of software.

The general experience has been terrible, and many implementations have created nothing but angry emails and help-desk tickets for IT folks.

It is a tough problem, and it’s something we’ve been hard at work solving. By delivering a vector-based stream that scales across any display size without requiring a high-bandwidth connection, we’re able to reproduce the native browser experience remotely. Users experience the website as it was intended, without all the compatibility issues introduced by scrubbing HTML, CSS and JavaScript. And performance issues are aided tremendously by the fact that the managed browser is hosted only milliseconds away on our network.

How secure remote browsing fits in with Cloudflare for Teams

Before Cloudflare Browser Isolation, Cloudflare for Teams consisted of two core services:

Cloudflare Access creates a Zero Trust network perimeter that allows users to access corporate applications without needing to poke holes in their internal network with a legacy VPN appliance.

Cloudflare Gateway creates a Secure Web Gateway that protects users from threats on any website.

These tools are excellent for protecting private Internet properties from unauthorized access and web browsing activity from known malicious websites. But what about unknown and unforeseeable threats?

Cloudflare Browser Isolation answers this question by sandboxing a web browser in a remote container that is easily disposed of at the end of the user’s browsing session or when compromised.

Should an unknown threat, such as a zero-day vulnerability or a malicious website, exploit any of the hundreds of Web APIs, the attack is confined to a browser running in a supervised cloud environment, leaving the end user’s device unaffected.

The Network is the Computer®

Web browsers are the foundation that the shift to the cloud has been built on. It’s just that they’ve always run in the wrong place.

In the same way that it stopped making sense for a developer to run and maintain the hardware their application runs on, the same case can be made for the other side of the cloud’s equation: the browser. Funnily enough, the solution is exactly the same: like the developer’s application, the browser needed to move to the cloud. However, as with all disruptions, it takes time and investment for the performance of the new technology to catch up to the old one. When AWS first launched in 2006, its inherent limitations meant that for most developers it still made sense to run on-premises solutions.

At some point though, the technology improves to the point where the disruption can start taking over from the previous paradigm.

Until today, the limiting factor for a cloud-based browser has often been the experience of using it. That experience is bounded by the speed of light: the time it takes a user’s input to travel to the remote data center and for the result to be returned to their display. To feel real time, that round trip needs to happen within milliseconds.

Cloudflare has one very big advantage in solving that problem.

Introducing Cloudflare Browser Isolation beta

To deliver real-time remote computing experiences, each of our 200+ data centers is capable of serving remote browsing sessions within the blink of an eye for nearly everyone connected to the Internet. This allows us to deliver a low-latency, responsive stream of a webpage regardless of where you’re physically located.

What’s next?

But that’s enough talking about it. We’d love for you to try it! Please complete the form here to sign up to be one of the first users of this new technology in our network. We’ll be in touch as we expand the beta to more users.

Backblaze B2 and the S3 Compatible API on Cloudflare

Post Syndicated from Tim Obezuk original https://blog.cloudflare.com/backblaze-b2-and-the-s3-compatible-api-on-cloudflare/

In May 2020, Backblaze, a founding Bandwidth Alliance partner, announced S3 compatible APIs for their B2 Cloud Storage service. As a refresher, the Bandwidth Alliance is a group of forward-thinking cloud and networking companies that are committed to discounting or waiving data transfer fees for shared customers. Backblaze has been a proud partner since 2018. We are excited to see Backblaze introduce a new level of compatibility in their Cloud Storage service.

History of the S3 API

First let’s dive into the history of the S3 API and why it’s important for Cloudflare users.

Prior to 2006, before the mass migration to the cloud, if you wanted to store content for your company you needed to build your own expensive, highly available storage platform, one large enough to hold all your existing content with enough growth headroom for your business. AWS launched to eliminate this model by renting out its physical computing and storage infrastructure.

Amazon Simple Storage Service (S3) led the market by offering a scalable and resilient tool for storing unlimited amounts of data without building it yourself. It could be integrated into any application, but there was one catch: you couldn’t use any existing standard such as WebDAV, FTP or SMB; your application needed to interface with Amazon’s bespoke S3 API.

Fast forward to 2020 and the storage provider landscape has become highly competitive, with many providers capable of offering petabyte (and exabyte) scale content storage at extremely low cost per gigabyte. However, Amazon S3 has remained dominant despite heavy competition and despite not being the most cost-effective option.

The broad adoption of the S3 API by developers in their codebases and internal systems has transformed the S3 API into what WebDAV promised to be: the de facto standard HTTP file storage API.

Engineering costs of changing storage providers

With many codebases and legacy applications entrenched in the S3 API, switching to a more cost-effective storage provider is not so easy. Companies need to weigh the cost of engineering time to program against a new storage API on top of the cost of physically moving their data.

This engineering overhead has led many storage providers to natively support the S3 API, leveling the playing field and allowing companies to focus on picking the most cost-effective provider.
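
To make this concrete, here is a minimal sketch of what “switching providers” can look like for a codebase that already speaks the S3 API through a signing library such as aws4fetch (the library used later in this post). The endpoints, bucket name and credentials below are placeholders, not real accounts; the point is that the request code stays identical and only the endpoint, region and keys change:

import { AwsClient } from 'aws4fetch'

// Only the endpoint, region and credentials distinguish one
// S3-compatible provider from another. (All values are placeholders.)
const providers = {
  amazonS3: {
    endpoint: 'https://example-bucket.s3.us-east-1.amazonaws.com',
    region: 'us-east-1',
  },
  backblazeB2: {
    endpoint: 'https://example-bucket.s3.us-west-002.backblazeb2.com',
    region: 'us-west-002',
  },
}

// The request code itself is the same regardless of provider.
async function getObject(providerName, key, credentials) {
  const { endpoint, region } = providers[providerName]
  const aws = new AwsClient({ ...credentials, service: 's3', region })
  return aws.fetch(`${endpoint}/${key}`)
}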

First-mile bandwidth costs and the Bandwidth Alliance

Cloudflare caches content in Points of Presence located in more than 200 cities around the world. This cached content is then handed off to your Internet service provider (ISP) over low-cost, and often free, Internet exchange connections in the same facility, over shared fibre optic cables. This cost saving is fairly well understood as the benefit of content delivery networks and has become highly commoditized.

What is less well understood is the first-mile cost of moving data from a storage provider to the content delivery network. Typically, storage providers expect traffic to route via the Internet and charge the consumer per gigabyte of data transmitted. This is not the case for Cloudflare, as we also share facilities and fibre optic interconnects with many storage providers.

These shared interconnects created an opportunity to waive the cost of first-mile bandwidth between Cloudflare and many providers, and they are what prompted us to create the Bandwidth Alliance.

Media and entertainment companies serving user-generated content have a continuous supply of new content moving over the first mile from the storage provider to the content delivery network. The first-mile bandwidth cost adds up, and using a Bandwidth Alliance partner such as Backblaze can eliminate it entirely.

Using the S3 API in Cloudflare Workers

The Solutions Engineering team at Cloudflare is tasked with providing strategic technical guidance for our enterprise customers.

It’s not uncommon for developers to connect Cloudflare’s global network directly to their storage provider and serve content such as live and on-demand video without an intermediate web server.

For security purposes, engineers typically use Cloudflare Workers to sign each uncached request using the S3 API. Cloudflare Workers allows anyone to deploy code to our global network of 200+ Points of Presence in seconds and is built on top of the Service Workers API.

We’ve tested Backblaze B2’s S3 Compatible API in Cloudflare Workers using the same code tested for Amazon S3 buckets and it works perfectly by changing the target endpoint.

Creating a S3 Compatible Worker script

Here’s how it’s done using Wrangler, the Cloudflare Workers CLI tool:

Generate a new project in Wrangler using a template intended for use with Amazon S3:

wrangler generate <projectname> https://github.com/obezuk/worker-signed-s3-template

This template uses aws4fetch, a fast, lightweight S3-compatible request signing library that is commonly used in Service Worker environments like Cloudflare Workers.

The template creates an index.js file with a standard request signing implementation:

import { AwsClient } from 'aws4fetch'

// Credentials, region and bucket hostname are provided as environment
// variables in wrangler.toml (see below).
const aws = new AwsClient({
    "accessKeyId": AWS_ACCESS_KEY_ID,
    "secretAccessKey": AWS_SECRET_ACCESS_KEY,
    "region": AWS_DEFAULT_REGION
});

addEventListener('fetch', function(event) {
    event.respondWith(handleRequest(event.request))
});

async function handleRequest(request) {
    // Rewrite the incoming request to point at the bucket's S3 endpoint,
    // sign it, then proxy it with caching enabled.
    var url = new URL(request.url);
    url.hostname = AWS_S3_BUCKET;
    var signedRequest = await aws.sign(url);
    return await fetch(signedRequest, { "cf": { "cacheEverything": true } });
}
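
One small, optional variation on the handler above (an addition of ours, not part of the template): because the Worker signs every request it proxies, you may want to restrict it to read-only methods so it cannot be used to write to or delete from the bucket. A drop-in replacement for handleRequest might look like this:

async function handleRequest(request) {
    // Only sign and proxy read requests; reject anything that could modify the bucket.
    if (request.method !== 'GET' && request.method !== 'HEAD') {
        return new Response('Method Not Allowed', { "status": 405 });
    }
    var url = new URL(request.url);
    url.hostname = AWS_S3_BUCKET;
    var signedRequest = await aws.sign(url);
    return await fetch(signedRequest, { "cf": { "cacheEverything": true } });
}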

Environment Variables

Modify your wrangler.toml file to use your Backblaze B2 API Key ID and Secret:

[env.dev]
vars = { AWS_ACCESS_KEY_ID = "<BACKBLAZE B2 keyId>", AWS_SECRET_ACCESS_KEY = "<BACKBLAZE B2 secret>", AWS_DEFAULT_REGION = "", AWS_S3_BUCKET = "<BACKBLAZE B2 bucketName>.<BACKBLAZE B2 S3 Endpoint>" }

The AWS_S3_BUCKET environment variable is the combination of your bucket name, a period, and the S3 endpoint. For a Backblaze B2 bucket named example-bucket with the S3 endpoint s3.us-west-002.backblazeb2.com, use example-bucket.s3.us-west-002.backblazeb2.com.

The AWS_DEFAULT_REGION environment variable is taken from your S3 endpoint; in this example it is us-west-002.

We recommend storing your AWS_SECRET_ACCESS_KEY as a secret environment variable rather than a plain-text variable when using this script in production.
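
For example, newer versions of Wrangler can upload the secret separately so it never needs to live in wrangler.toml (this assumes your Wrangler version includes the secret subcommand):

wrangler secret put AWS_SECRET_ACCESS_KEY --env dev

Wrangler prompts for the value and stores it with the Worker instead of in your source tree.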

Preview your Cloudflare Worker

Next, run wrangler preview --env dev to open a preview of your Worker script. In my case, the Backblaze B2 bucket contained a static website serving adaptive streaming video content.

Note: We permit caching of third party video content only for enterprise domains. Free/Pro/Biz users wanting to serve video content via Cloudflare may use Stream which delivers an end-to-end video delivery service.

Backblaze B2’s compatibility with the S3 API is an exciting update that makes their storage platform work with existing codebases and legacy systems. And, as a special offer to Cloudflare blog readers, Backblaze will pay the migration costs for transferring your data from S3 to Backblaze B2 (click here for more detail). With the cost of migration covered and compatibility with your existing workflows, it is now easier than ever to switch to a Bandwidth Alliance partner and save on first-mile costs. By doing so, you can slash your cloud bills, gain flexibility, and make no compromises on performance.

To learn more, join us on May 14th for a webinar focused on getting you ultra-fast worldwide content delivery.

Unit Testing Workers, in Cloudflare Workers

Post Syndicated from Tim Obezuk original https://blog.cloudflare.com/unit-testing-workers-in-cloudflare-workers/

We recently wrote about unit testing Cloudflare Workers within a mock environment using CloudWorker (a Node.js based mock Cloudflare Worker environment created by Dollar Shave Club’s engineering team). See Unit Testing Worker Functions.

Even though Cloudflare Workers deploy globally within seconds, software developers often choose to use local mock environments for the fastest possible feedback loop while developing on their local machines. CloudWorker is perfect for this use case, but as a mock environment it does not guarantee an identical runtime or support for every Cloudflare Workers API and feature. This gap can make developers uneasy, as they do not have 100% certainty that their tests will succeed in the production environment.

In this post, we’re going to demonstrate how to generate a Cloudflare Worker compatible test harness which can execute Mocha unit tests directly in the production Cloudflare environment.

Directory Setup

Create a new folder for your project, change into it, and run npm init to initialise the package.json file.

Run mkdir -p src && mkdir -p test/lib && mkdir dist to create folders used by the next steps. Your folder should look like this:

.
./dist
./src/worker.js
./test
./test/lib
./package.json

npm install --save-dev mocha exports-loader webpack webpack-cli

This will install Mocha (the unit testing framework), Webpack (a tool used to package the code into a single Worker script) and Exports Loader (a tool used by Webpack to import the Worker script into the Worker-based Mocha environment).

npm install --save-dev git+https://github.com/obezuk/mocha-loader.git

This will install a modified version of Webpack’s mocha loader. It has been modified to support the Web Worker environment type. We are excited to see Web Worker support merged into Mocha Loader so please vote for our pull request here: https://github.com/webpack-contrib/mocha-loader/pull/77

Example Script

Create your Worker script in ./src/worker.js:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function addition(a, b) {
  return a + b
}

async function handleRequest(request) {
  const added = await addition(1,3)
  return new Response(`The Sum is ${added}!`)
}

Add Tests

Create your unit tests in ./test/test.test.js:

const assert = require('assert')

describe('Worker Test', function() {

    it('returns a body that says The Sum is 4', async function () {
        let url = new URL('https://worker.example.com')
        let req = new Request(url)
        let res = await handleRequest(req)
        let body = await res.text()
        assert.equal(body, 'The Sum is 4!')
    })

    it('does addition properly', async function() {
        let res = await addition(1, 1)
        assert.equal(res, 2)
    })

})

Mocha in Worker Test Harness

In order to execute Mocha unit tests within Cloudflare Workers, we are going to build a test harness. The test harness script looks a lot like a normal Worker script, but it integrates your ./src/worker.js and ./test/test.test.js into a script capable of executing the Mocha unit tests within the Cloudflare Workers runtime.

Create the below script in ./test/lib/serviceworker-mocha-harness.js.

import 'mocha';

import 'mocha-loader!../test.test.js';

// Test results from the Mocha run, cached between requests.
var testResults;

// Run the Mocha suite once and resolve when it has finished.
async function mochaRun() {
    return new Promise(function (accept, reject) {
        var runner = mocha.run(function () {
            testResults = runner.testResults;
            accept();
        });
    });
}

addEventListener('fetch', event => {
    event.respondWith(handleMochaRequest(event.request))
});

async function handleMochaRequest(request) {

    // Run the unit tests on the first request; later requests reuse the cached results.
    if (!testResults) {
        await mochaRun();
    }

    var headers = new Headers({
        "content-type": "application/json"
    })

    var statusCode = 200;

    if (testResults.failures != 0) {
        statusCode = 500;
    }

    return new Response(JSON.stringify(testResults), {
        "status": statusCode,
        "headers": headers
    });

}

// Pull the functions under test out of the Worker script and into the global
// scope so the unit tests can call them directly.
Object.assign(global, require('exports-loader?handleRequest,addition!../../src/worker.js'));

Mocha Webpack Configuration

Create a new file in the project root directory called: ./webpack.mocha.config.js. This file is used by Webpack to bundle the test harness, worker script and unit tests into a single script that can be deployed to Cloudflare.

module.exports = {
  // Build for a worker-style environment rather than Node or the browser DOM.
  target: 'webworker',
  entry: "./test/lib/serviceworker-mocha-harness.js",
  mode: "development",
  optimization: {
    // Keep the bundled output unminified.
    minimize: false
  },
  performance: {
    // Disable bundle-size warnings.
    hints: false
  },
  node: {
    // Stub out Node's fs module, which is not available in Workers.
    fs: 'empty'
  },
  module: {
    // Suppress warnings about dynamic require() expressions in dependencies.
    exprContextCritical: false
  },
  output: {
    path: __dirname + "/dist",
    publicPath: "dist",
    filename: "worker-mocha-harness.js"
  }
};

Your file structure should look like (excluding node_modules):

.
./dist
./src/worker.js
./test/test.test.js
./test/lib/serviceworker-mocha-harness.js
./package.json
./package-lock.json
./webpack.mocha.config.js

Customising the test harness

If you wish to extend the test harness to support your own test files, you will need to add additional test imports to the top of the script:

import 'mocha-loader!/* TEST FILE NAME HERE */'

If you wish to import additional functions from your Worker script into the test harness environment, you will need to add them, comma separated, to the last line:

Object.assign(global, require('exports-loader?/* COMMA SEPARATED FUNCTION NAMES HERE */!../../src/worker.js'));
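
For example, a hypothetical harness that adds a second test file and exposes one extra Worker function would change only those two lines (routes.test.js and subtraction are placeholder names, not files or functions from this post):

import 'mocha';

import 'mocha-loader!../test.test.js';
import 'mocha-loader!../routes.test.js';

// ... the rest of the harness is unchanged ...

Object.assign(global, require('exports-loader?handleRequest,addition,subtraction!../../src/worker.js'));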

Running the test harness

Deploying and running the test harness is identical to deploying any other Worker script with Webpack.

Modify the scripts section of package.json to include the build-harness command.

"scripts": {
  "build-harness": "webpack --config webpack.mocha.config.js -p --progress --colors"
}

In the project root directory run the command npm run build-harness to generate and bundle your Worker script, Mocha and your unit tests into ./dist/worker-mocha-harness.js.

Upload this script to a test Cloudflare Workers route and run curl --fail https://test.example.org. If the unit tests pass, the harness returns a 200 response; if any fail, it returns a 500 response.

Integrating into an existing CI/CD pipeline

You can integrate Cloudflare Workers and the test harness into your existing CI/CD pipeline by using our API: https://developers.cloudflare.com/workers/api/.

The test harness returns detailed test reports in JSON format:

Example Success Response

{
  "stats": {
    "suites": 1,
    "tests": 2,
    "passes": 2,
    "pending": 0,
    "failures": 0,
    "start": "2019-04-23T06:24:33.492Z",
    "end": "2019-04-23T06:24:33.590Z",
    "duration": 98
  },
  "tests": [
    {
      "title": "returns a body that says The Sum is 4",
      "fullTitle": "Worker Test returns a body that says The Sum is 4",
      "duration": 0,
      "currentRetry": 0,
      "err": {}
    },
    {
      "title": "does addition properly",
      "fullTitle": "Worker Test does addition properly",
      "duration": 0,
      "currentRetry": 0,
      "err": {}
    }
  ],
  "pending": [],
  "failures": [],
  "passes": [
    {
      "title": "returns a body that says The Sum is 4",
      "fullTitle": "Worker Test returns a body that says The Sum is 4",
      "duration": 0,
      "currentRetry": 0,
      "err": {}
    },
    {
      "title": "does addition properly",
      "fullTitle": "Worker Test does addition properly",
      "duration": 0,
      "currentRetry": 0,
      "err": {}
    }
  ]
}

Example Failure Response

{
  "stats": {
    "suites": 1,
    "tests": 2,
    "passes": 0,
    "pending": 0,
    "failures": 2,
    "start": "2019-04-23T06:25:52.100Z",
    "end": "2019-04-23T06:25:52.170Z",
    "duration": 70
  },
  "tests": [
    {
      "title": "returns a body that says The Sum is 4",
      "fullTitle": "Worker Test returns a body that says The Sum is 4",
      "duration": 0,
      "currentRetry": 0,
      "err": {
        "name": "AssertionError",
        "actual": "The Sum is 5!",
        "expected": "The Sum is 4!",
        "operator": "==",
        "message": "'The Sum is 5!' == 'The Sum is 4!'",
        "generatedMessage": true,
        "stack": "AssertionError: 'The Sum is 5!' == 'The Sum is 4!'\n    at Context.<anonymous> (worker.js:19152:16)"
      }
    },
    {
      "title": "does addition properly",
      "fullTitle": "Worker Test does addition properly",
      "duration": 0,
      "currentRetry": 0,
      "err": {
        "name": "AssertionError",
        "actual": "3",
        "expected": "2",
        "operator": "==",
        "message": "3 == 2",
        "generatedMessage": true,
        "stack": "AssertionError: 3 == 2\n    at Context.<anonymous> (worker.js:19157:16)"
      }
    }
  ],
  "pending": [],
  "failures": [
    {
      "title": "returns a body that says The Sum is 4",
      "fullTitle": "Worker Test returns a body that says The Sum is 4",
      "duration": 0,
      "currentRetry": 0,
      "err": {
        "name": "AssertionError",
        "actual": "The Sum is 5!",
        "expected": "The Sum is 4!",
        "operator": "==",
        "message": "'The Sum is 5!' == 'The Sum is 4!'",
        "generatedMessage": true,
        "stack": "AssertionError: 'The Sum is 5!' == 'The Sum is 4!'\n    at Context.<anonymous> (worker.js:19152:16)"
      }
    },
    {
      "title": "does addition properly",
      "fullTitle": "Worker Test does addition properly",
      "duration": 0,
      "currentRetry": 0,
      "err": {
        "name": "AssertionError",
        "actual": "3",
        "expected": "2",
        "operator": "==",
        "message": "3 == 2",
        "generatedMessage": true,
        "stack": "AssertionError: 3 == 2\n    at Context.<anonymous> (worker.js:19157:16)"
      }
    }
  ],
  "passes": []
}
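
Because the report is plain JSON, a CI step only needs to fetch the harness route, inspect the stats, and fail the build when anything failed. The sketch below is one possible shape, assuming a Node.js 18+ runner (for the global fetch) and a hypothetical harness URL; adapt it to wherever you deployed the script:

// ci-check.mjs: fail the pipeline if the deployed test harness reports failures.
const HARNESS_URL = 'https://test.example.org/'; // hypothetical harness route

const res = await fetch(HARNESS_URL);
const report = await res.json();

console.log(`${report.stats.passes} passed, ${report.stats.failures} failed`);

if (!res.ok || report.stats.failures > 0) {
    for (const failure of report.failures) {
        console.error(`FAIL: ${failure.fullTitle}: ${failure.err.message}`);
    }
    process.exit(1);
}

Run it after npm run build-harness and your deploy step; the non-zero exit code is what most CI systems use to mark the job as failed.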

This is really powerful and can allow you to execute your unit tests directly in the Cloudflare runtime, giving you more confidence before releasing your code into production. We hope this was useful and welcome any feedback.