
Court Cracks Down on ‘Future’ Pirate Mayweather-McGregor Streams

Post Syndicated from Ernesto original https://torrentfreak.com/court-cracks-down-on-future-pirate-mayweather-mcgregor-streams-170821/

This weekend, the undefeated Floyd Mayweather Jr. will go head-to-head with UFC lightweight champion Conor McGregor at the T-Mobile Arena in Las Vegas.

The fight is not just about prestige, but also about money. Some predict that the unusual matchup could pull in a staggering one billion dollars.

A significant portion of this will go to each of the fighters, but rightsholders such as Showtime benefit as well.

People who want to stream the event live over the Internet will have to cough up between $89.95 and $99.99. This will generate millions of dollars in revenue, but the numbers would be even higher if it weren't so easy to stream the fight through pirate sites.

This is why Showtime took some of the most brazen pirate sites to court last week, demanding an injunction to stop the pirated streams before they even start. In its complaint, the cable TV provider listed 44 domain names which advertise the fight, urging the court to shut them down pre-emptively.

A few of the 44 targeted (sub)domains.

After reviewing the application, United States District Judge André Birotte Jr. approved the preliminary injunction, which forbids the sites' operators from offering infringing streams. The injunction stays in place until August 28, two days after the event.

While the order is a clear win for Showtime, it's unclear how effective it will be. The sites in question are all believed to be connected to LiveStreamHDQ and its alleged operator "Kopa Mayweather," whom Showtime has battled before.

At the time of writing, the sites are all still online, although the language appears to have changed. Many now have articles explaining how the fight can be watched legally. Whether it stays that way remains to be seen.

Updated ‘pirate’ site

Interestingly, the injunction doesn’t mention any domain name registrars or registries. When Showtime applied for similar measures in the past, the company specifically asked to take control of domain names, so these couldn’t be used for any infringing activity.

That said, the current order applies to the defendants and any others who are “in active concert or participation” with them, so this might be enough for domain registrars and other parties to take appropriate action.

Showtime can also request updates to the injunction if needed, but with only a few days to go, any changes would have to happen swiftly.

As mentioned earlier, this is not the first time that Showtime has gone after alleged pirates before they get a chance to commit an offense. The company launched similar cases for the Mayweather vs. Pacquiao and Mayweather vs. Berto matchups in 2015.

While these efforts were successful in taking a few pirate sites down, there were plenty of unauthorized streams available when the events started. This time it’s not likely to be any different. With hundreds of live streaming sites and tools out there, piracy will remain undefeated.

A copy of the preliminary injunction is available here (pdf).


Hunting for life on Mars assisted by high-altitude balloons

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/eclipse-high-altitude-balloons/

Will bacteria-laden high-altitude balloons help us find life on Mars? Today’s eclipse should bring us closer to an answer.


image c/o NASA / Ames Research Center / Tristan Caro

The Eclipse Ballooning Project

Having learned of the Eclipse Ballooning Project set to take place today across the USA, a team at NASA couldn’t miss the opportunity to harness the high-flying project for their own experiments.


The Eclipse Ballooning Project invited students across the USA to aid in the launch of 50+ high-altitude balloons during today’s eclipse. Each balloon is equipped with its own Raspberry Pi and camera for data collection and live video-streaming.

High-altitude ballooning, or HAB as it’s often referred to, has become a popular activity within the Raspberry Pi community. The lightweight nature of the device allows for high ascent, and its Camera Module enables instant visual content collection.


image c/o Montana State University

The Eclipse Ballooning Project team, headed by Angela Des Jardins of Montana State University, was contacted by Jim Green, Director of Planetary Science at NASA, who hoped to piggyback on the project to run tests on bacteria in the Mars-like conditions the balloons would encounter near space.

Into the stratosphere

At around -35 degrees Fahrenheit, with thinner air and harsher ultraviolet radiation, the conditions in the upper part of the Earth's stratosphere are comparable to those on the surface of Mars. And during the eclipse, the moon will block some UV rays, making the environment in our stratosphere even more similar to the Martian one: ideal for NASA's experiment.

So that the students taking part in the Eclipse Ballooning Project could help the scientists out, NASA sent them some small metal tags.


These tags contain samples of a kind of bacterium known as Paenibacillus xerothermodurans. Upon their return to the ground, the bacteria will be tested to see whether and how the high-altitude conditions affected them.

Life on Mars

Paenibacillus xerothermodurans is one of the most resilient bacterial species we know. The team at NASA wants to discover how the bacteria react to their flight in order to learn more about whether life on Mars could possibly exist. If the low temperature, UV rays, and air conditions cause the bacteria to mutate or indeed die, we can be pretty sure that the existence of living organisms on the surface of Mars is very unlikely.


What happens to the bacteria on the spacecraft and rovers we send to space? This experiment should provide some answers.

The eclipse

If you’re in the US, you might have a chance to witness the full solar eclipse today. And if you’re planning to watch, please make sure to take all precautionary measures. In a nutshell, don’t look directly at the sun. Not today, not ever.

If you’re in the UK, you can observe a partial eclipse, if the clouds decide to vanish. And again, take note of safety measures so you don’t damage your eyes.


You can also watch a live-stream of the eclipse via the NASA website.

If you’ve created an eclipse-viewing Raspberry Pi project, make sure to share it with us. And while we’re talking about eclipses and balloons, check here for our coverage of the 2015 balloon launches coinciding with the UK’s partial eclipse.

The post Hunting for life on Mars assisted by high-altitude balloons appeared first on Raspberry Pi.

Healthy Aussie Pirates Set To Face Cash ‘Fines’, Poor & Sick Should Be OK

Post Syndicated from Andy original https://torrentfreak.com/healthy-aussie-pirates-set-to-face-cash-fines-poor-sick-should-be-ok-170821/

One of the oldest methods of trying to get people to stop downloading and sharing pirated material is by hitting them with ‘fines’.

The RIAA began the practice in September 2003, tracking people sharing music on early peer-to-peer networks, finding out their identities via ISPs, and sending them cease-and-desist orders with a request to pay hundreds to thousands of dollars.

Many thousands of people were fined and the campaign raised awareness, but it did nothing to stop millions of file-sharers who continue to this day.

Village Roadshow co-chief Graham Burke now wants to do something about that. He says his company will effectively mimic the RIAA's campaign of 14 years ago and begin suing Internet pirates Down Under. He told AFR that his company is already setting things up, ready to begin suing later in the year.

Few details have been made available at this stage but it’s almost certain that Village Roadshow’s targets will be BitTorrent users. It’s possible that users of other peer-to-peer networks could be affected but due to their inefficiency and relative obscurity, it’s very unlikely.

That leaves users of The Pirate Bay and any other torrent site vulnerable to the company, which will jump into torrent swarms masquerading as regular users, track IP addresses, and trace them back to Internet service providers. What happens next will depend on the responses of those ISPs.

If the ISPs refuse to cooperate, they will have to be taken to court to force them to hand over the personal details of their subscribers to Village Roadshow. It’s extremely unlikely they’ll hand them over voluntarily, so it could be some time before any ISP customer hears anything from the film distributor.

The bottom line is that Village Roadshow will want money to go away and Burke is already being open over the kind of sums his company will ask for.

“We will be looking for damages commensurate with what they’ve done. We’ll be saying ‘You’ve downloaded our Mad Max: Fury Road, our Red Dog, and we want $40 for the four movies plus $200 in costs’,” he says.

While no one will relish any kind of 'bill' dropping through a mailbox, in the scheme of things an AUS$240 settlement demand isn't huge, especially when compared to the sums demanded by companies such as Voltage Pictures, which tried and failed to start piracy litigation in Australia two years ago.

However, there’s even better news for some, who have already been given a heads-up that they won’t have to pay anything.

“We will identify people who are stealing our product, we will ask them do they have ill health or dire circumstances, and if they do and undertake to stop, we’ll drop the case,” Burke says.

While being upfront about such a policy has its pros and cons, Burke is also reducing his range of targets, particularly if he likes to be seen as a man of his word, whenever those words were delivered. In March 2016, when he restated his intention to begin suing pirates, he also excluded some other groups from legal action.

“We don’t want to sue 16-year-olds or mums and dads,” Burke said. “It takes 18 months to go through the courts and all that does is make lawyers rich and clog the court system. It’s not effective.”

It remains to be seen what criteria Village Roadshow ultimately employs, but it's likely the company will be asked to explain its intentions to the court when it embarks on the process to discover alleged pirates' identities. When it's decided who is eligible, Burke says the gloves will come off, with pirates being "pursued vigorously" and "sued for damages."

While Village Roadshow’s list of films is considerable, any with a specifically Australian slant seem the most likely to feature in any legal action. Burke tends to push the narrative that he’s looking after local industry so something like Mad Max: Fury Road would be perfect. It would also provide easy pickings for any anti-piracy company seeking to harvest Aussie IP addresses since it’s still very popular.

Finally, it’s worth noting that Australians who use pirate streaming services will be completely immune to the company’s planned lawsuit campaign. However, Burke appears to be tackling that threat using a couple of popular tactics currently being deployed elsewhere by the movie industry.

"Google are not doing enough and could do a lot more," he told The Australian (subscription).

Burke said that he was “shocked” at how easy it was to find streaming content using Google’s search so decided to carry out some research of his own at home. He said he found Christopher Nolan’s Dunkirk with no difficulty but that came with a sting in the tail.

According to the movie boss, his computer was immediately infected with malware and began asking for his credit card details. He doesn’t say whether he put them in.

As clearly the world’s most unlucky would-be movie pirate, Burke deserves much sympathy. It’s also completely coincidental that Hollywood is now pushing a “danger” narrative to keep people away from pirate sites.


Top 10 Most Pirated Movies of The Week on BitTorrent – 08/21/17

Post Syndicated from Ernesto original https://torrentfreak.com/top-10-pirated-movies-week-bittorrent-082117/

This week we have two newcomers in our chart.

Baywatch is the most downloaded movie.

The data for our weekly download chart is estimated by TorrentFreak, and is for informational and educational reference only. All the movies in the list are Web-DL/Webrip/HDRip/BDrip/DVDrip unless stated otherwise.

RSS feed for the weekly movie download chart.

This week’s most downloaded movies are:
Most downloaded movies via torrents

Rank | Last week | Movie name | IMDb rating
1 | (…) | Baywatch | 5.7
2 | (1) | Guardians of the Galaxy Vol. 2 | 8.0
3 | (2) | The Mummy 2017 | 5.8
4 | (3) | King Arthur: Legend of the Sword | 7.2
5 | (6) | Wonder Woman (Subbed HDrip) | 8.2
6 | (4) | Spider-Man: Homecoming (HDTS) | 8.0
7 | (…) | Security | 8.2
8 | (5) | The Boss Baby | 6.5
9 | (10) | Despicable Me 3 (HDTS) | 6.4
10 | (9) | Ghost In the Shell | 6.8


The Windows App Store is Full of Pirate Streaming Apps

Post Syndicated from Ernesto original https://torrentfreak.com/the-windows-app-store-is-full-of-pirate-streaming-apps-170820/

Over the past few years it has become much easier to stream movies and TV-shows over the Internet.

Legal streaming services such as Netflix and Amazon are booming. At the same time, however, there’s also a dark market of thousands of pirate streaming tools.

In recent months, Hollywood has directed many of its anti-piracy efforts towards unauthorized Kodi add-ons and several popular pirate streaming sites, which offer movies and TV-shows without permission. What seems to be largely ignored, however, is a "store" that hundreds of millions of people have access to: the Windows App Store.

When we were browsing through the “top free” apps in the Windows Store, our attention was drawn to several applications that promoted “free movies” including various Hollywood blockbusters such as “Wonder Woman,” “Spider-Man: Homecoming,” and “The Mummy.”

Initially, we assumed that a pirate app may have slipped past Microsoft's screening process. However, the 'problem' doesn't appear to be isolated. There are dozens of similar apps in the official store that promise potential users free movies, most with rave reviews.

Some of the many pirate apps in the “trusted” store

Most of the applications work on multiple platforms including PC, mobile, and the Xbox. They are pretty easy to use and rely on the familiar grid-based streaming interface most sites and services use. Pick a movie or TV-show, click the play button, and off you go.

The sheer number of piracy apps in the Windows Store, using names such as “Free Movies HD,” “Free Movies Online 2020,” and “FreeFlix HQ,” came as a surprise to us. In particular, because the developers make no attempt to hide their activities, quite the opposite.

The app descriptions are littered with colorful language offering the latest Hollywood movies, and thousands of others, without charge. In addition, the apps display their capabilities in various screenshots, including those showing movies that are not yet available on legal streaming platforms.

Screenshot provided by the Windows app store

Making matters worse, the applications show advertising as well, including high-quality pre-roll ads. Some of these appear to be facilitated through Microsoft’s own Ad Monetization platform. Other apps offer paid versions or in-app purchases to monetize their service.

After hours of going through the pirate app offerings, it's clear that Microsoft's "trusted" Windows Store is riddled with unauthorized content. Thus far we have only mentioned video, but the issue also applies to pirated music, in the form of dedicated streaming and download apps.

Earlier this year, Microsoft signed a landmark anti-piracy agreement with several major copyright holders to address pirate search results in the Bing search engine. The above makes clear that search results in the Microsoft Store may require some attention too.

TorrentFreak reached out to Microsoft, asking for a comment on our findings, but at the time of publication we haven’t yet heard back.


Streaming Service iflix Buys Shows Based on Piracy Data

Post Syndicated from Ernesto original https://torrentfreak.com/streaming-service-iflix-buys-shows-based-on-piracy-data-170819/

When major movie and TV companies discuss piracy they often mention the massive losses incurred as a result of unauthorized downloads and streams.

However, this unofficial market also offers a valuable pool of often publicly available data on the media consumption habits of a relatively young generation.

Many believe that piracy is in part a market signal showing copyright holders what consumers want. This makes piracy statistics key business intelligence, which some companies have started to realize.

Netflix, for example, previously said that their offering is partly based on what shows do well on BitTorrent networks and other pirate sites. In addition, the streaming service also uses piracy to figure out how much they can charge in a country. They are not alone.

Other major entertainment companies also keep a close eye on piracy, using this data to their advantage. This includes the Asia-based streaming portal iflix, which recently secured $133 million in funding and boasts over five million users.

Iflix co-founder Patrick Grove says that his company actively uses piracy numbers to determine what content they acquire. The data reveal what is popular locally, and help to give viewers the TV-shows and movies they’re most interested in.

"We looked at piracy data in every market," Grove informed CNBC's Managing Asia. The company doesn't stop at looking at a few torrent download numbers, either.

Representatives from the Asian company actually went out on the streets to buy pirated DVDs from street vendors. In addition, iflix also received help from local Internet providers which shared a variety of streaming data.

TorrentFreak reached out to the streaming service to get more details about their data gathering techniques. One of the main partners to measure online piracy is the German company TECXIPIO, which is known to actively monitor BitTorrent traffic.

The company also maintains a close relationship with Internet providers that offer further insight, including streaming data, to determine which titles work best in each market.

While analyzing the different sets of data, the streaming service was surprised to see the diversity in different regions as well as the ever-changing consumer demand.

“Through looking at the Top 20 pirated DVDs in every market we are live in, we were surprised to find the amount of pirated K-drama content. In Ghana for example, the number one pirated title is K-drama series called ‘Legend of the Blue Sea’,” an iflix spokesperson told us.

Iflix believes that piracy data is superior to other market intelligence. Before rolling out its service in Saudi Arabia the company made a list of the 1,000 most popular shows and used that to its advantage.

While there is a lot of piracy in emerging markets, iflix doesn't believe that people are unwilling to pay for entertainment. It just has to be available for a decent price, and that's where they come in.

“We believe that people in emerging markets do not actively want to steal content, they do so because there is no better alternative,” the company informs us.

“As consumers become more connected, gaining access to information and cultural influences on a global scale, they want to be entertained at a world-class standard. We set out with the aim of offering an alternative that is better than piracy; by providing unlimited access to high-quality, world-class entertainment, all at the price of pirated DVD.”

There is no doubt that iflix is ambitious, and that it’s willing to employ some unusual tactics to grow its userbase. The company is quite optimistic about the future as well, judging from its co-founder’s prediction that it will welcome its billionth viewer in a few years.


Rightscorp Bleeds Another Million, Borrows $200K From Customer BMG

Post Syndicated from Andy original https://torrentfreak.com/rightscorp-bleeds-another-million-borrows-200k-from-customer-bmg-170819/

Anti-piracy outfit Rightscorp is one of the many companies trying to turn Internet piracy into profit. The company has a somewhat novel approach but has difficulty balancing the books.

Essentially, Rightscorp operates like other so-called copyright-trolling operations, in that it monitors alleged offenders on BitTorrent networks, tracks them to their ISPs, then attempts to extract a cash settlement. Rightscorp does this by sending DMCA notices with settlement agreements attached, in the hope that at-this-point-anonymous Internet users break cover in panic. This can lead to a $20 or $30 ‘fine’ or in some cases dozens of multiples of that.

But despite settling hundreds of thousands of these cases, profit has thus far proven elusive, with the company hemorrhaging millions in losses. The company has just filed its results for the first half of 2017 and they contain more bad news.

In the six months ended June 2017, revenues obtained from copyright settlements reached just $138,514, that’s 35% down on the $214,326 generated in the same period last year. However, the company did manage to book $148,332 in “consulting revenue” in the first half of this year, a business area that generated no revenue in 2016.

Overall then, total revenue for the six-month period was $286,846 – up from $214,326 last year. While that's a better picture in its own right, Rightscorp has a lot of costs attached to its business.

After paying out $69,257 to copyright holders and absorbing $1,190,696 in general and administrative costs, among other things, the company’s total operating expenses topped out at $1,296,127 for the first six months of the year.

To make a long story short, the company made a net loss of $1,068,422, more than the $995,265 loss it made in the same period last year, despite the improved revenues. The company ended June with just $1,725 in cash.

“These factors raise substantial doubt about the Company’s ability to continue as a going concern within one year after the date that the financial statements are issued,” the company’s latest statement reads.

This hanging-by-a-thread narrative has followed Rightscorp for the past few years but there’s information in the latest accounts which indicates how bad things were at the start of the year.

In January 2016, Rightscorp and several copyright holders, including Hollywood studio Warner Bros, agreed to settle a class-action lawsuit over intimidating robo-calls that were made to alleged infringers. The defendants agreed to set aside $450,000 to cover the costs, and it appears that Rightscorp was liable for at least $200,000 of that.

Rightscorp hasn’t exactly been flush with cash, so it was interesting to read that its main consumer piracy settlement client, music publisher BMG, actually stepped in to pay off the class-action settlement.

“At December 31, 2016, the Company had accrued $200,000 related to the settlement of a class action complaint. On January 7, 2017, BMG Rights Management (US) LLC (“BMG”) advanced the Company $200,000, which was used to pay off the settlement. The advance from BMG is to be applied to future billings from the Company to BMG for consulting services,” Rightscorp’s filing reads.

With Rightscorp’s future BMG revenue now being gobbled up by what appears to be loan repayments, it becomes difficult to see how the anti-piracy outfit can make enough money to pay off the $200,000 debt. However, its filing notes that on July 21, 2017, the company issued “an aggregate of 10,000,000 shares of common stock to an investor for a purchase price of $200,000.” While that amount matches the BMG debt, the filing doesn’t reveal who the investor is.

The filing also reveals that on July 31, Rightscorp entered into two agreements to provide services “to a holder of multiple copyrights.” The copyright holder isn’t named, but the deal reveals that it’s in Rightscorp’s best interests to get immediate payment from people to whom it sends cash settlement demands.

“[Rightscorp] will receive 50% of all gross proceeds of any settlement revenue received by the Client from pre-lawsuit ‘advisory notices,’ and 37.5% of all gross proceeds received by the Client from ‘final warning’ notices sent immediately prior to a lawsuit,” the filing notes.

Also of interest is that Rightscorp has offered not to work with any of the copyright holders’ direct competitors, providing certain thresholds are met – $10,000 revenue in the first month to $100,000 after 12 months. But there’s more to the deal.

Rightscorp will also provide a number of services to this client including detecting and verifying copyright works on P2P networks, providing information about infringers, plus reporting, litigation support, and copyright protection advisory services.

For this, Rightscorp will earn $10,000 for the first three months, rising to $85,000 per month after 16 months, valuable revenue for a company fighting for its life.


Announcing the Winners of the AWS Chatbot Challenge – Conversational, Intelligent Chatbots using Amazon Lex and AWS Lambda

Post Syndicated from Tara Walker original https://aws.amazon.com/blogs/aws/announcing-the-winners-of-the-aws-chatbot-challenge-conversational-intelligent-chatbots-using-amazon-lex-and-aws-lambda/

A couple of months ago on the blog, I announced the AWS Chatbot Challenge in conjunction with Slack. The AWS Chatbot Challenge was an opportunity to build a unique chatbot that helped to solve a problem or that would add value for its prospective users. The mission was to build a conversational, natural language chatbot using Amazon Lex and leverage Lex’s integration with AWS Lambda to execute logic or data processing on the backend.

I know that you have all been anxiously waiting to hear who won the AWS Chatbot Challenge, just as much as I was. Well, wait no longer: the winners of the AWS Chatbot Challenge have been decided.

May I have the Envelope Please? (The Trumpets sound)

The winners of the AWS Chatbot Challenge are:

  • First Place: BuildFax Counts by Joe Emison
  • Second Place: Hubsy by Andrew Riess, Andrew Puch, and John Wetzel
  • Third Place: PFMBot by Benny Leong and his team from MoneyLion
  • Large Organization Winner: ADP Payroll Innovation Bot by Eric Liu, Jiaxing Yan, and Fan Yang


Diving into the Winning Chatbot Projects

Let's walk through the details of each winning project to see what made these chatbots distinctive, as well as learn more about the technologies used to implement each chatbot solution.


BuildFax Counts by Joe Emison

The BuildFax Counts bot was created as a real solution for the BuildFax company, to decrease the time it takes for sales and marketing teams to get answers about permits, or about properties whose permits meet certain criteria.

BuildFax, a company co-founded by bot developer Joe Emison, has the only national database of building permits, which updates data from approximately half of the United States on a monthly basis. To accommodate the many requests that come in from the sales and marketing team regarding permit information, BuildFax has a technical sales support team that fulfills requests sent to a ticketing system by manually writing SQL queries that run across the shards of the BuildFax databases. Given the large number of requests received by the internal sales support team and the manual nature of setting up the queries, it can take several days for the sales and marketing teams to receive an answer.

The BuildFax Counts chatbot solves this problem by taking the permit inquiry that would normally be sent in as a ticket by the sales and marketing team, and accepting it as input from Slack. Once the inquiry is submitted in Slack, a query executes and the results are returned immediately.

Joe built this solution by first creating a nightly export of the data in their BuildFax MySQL RDS database to CSV files that are stored in Amazon S3. From the exported CSV files, an Amazon Athena table was created in order to run quick and efficient queries on the data. He then used Amazon Lex to create a bot to handle the common questions and criteria that may be asked by the sales and marketing teams when seeking data from the BuildFax database, modeling the language used in the BuildFax ticketing system. He added several different sample utterances and slot types, both custom and Lex-provided, in order to correctly parse every question and criteria combination that could be received from an inquiry. Using Lambda, Joe created a JavaScript Lambda function that receives information from the Lex intent and uses it to build a SQL statement that runs against the aforementioned Athena database, using the AWS SDK for JavaScript in Node.js to return the inquiry count result and the SQL statement used.
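Joe's function was written in JavaScript; as a rough sketch of the same flow in Python (the slot, database, and bucket names here are hypothetical, and a production version would poll get_query_execution for the query results), the handler might look like this:

import boto3

athena = boto3.client('athena')

def lambda_handler(event, context):
    # Lex (V1) passes the parsed slot values along with the intent
    slots = event['currentIntent']['slots']
    state = slots.get('State')              # hypothetical slot name
    permit_type = slots.get('PermitType')   # hypothetical slot name

    # Build the SQL statement from the slot values
    sql = ("SELECT COUNT(*) AS permit_count FROM permits "
           "WHERE state = '{}' AND permit_type = '{}'").format(state, permit_type)

    # Kick off the query against the Athena database backed by the S3 exports
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={'Database': 'buildfax'},  # hypothetical database
        ResultConfiguration={'OutputLocation': 's3://example-athena-results/'}
    )

    # Close out the Lex conversation, echoing the SQL statement used
    return {
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': 'Running: {} (execution {})'.format(
                    sql, execution['QueryExecutionId'])
            }
        }
    }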

The BuildFax Counts bot is used today by the BuildFax sales and marketing team to get data on inquiries immediately, where results previously took up to a week to arrive.

Not only is the BuildFax Counts bot our 1st place winner and a wonderful solution, but its creator, Joe Emison, is a great guy. Joe has opted to donate his prize – the $5,000 cash, the $2,500 in AWS credits, and one re:Invent ticket – to the Black Girls Code organization. I must say, you rock, Joe, for helping these kids get access and exposure to technology.


Hubsy by Andrew Riess, Andrew Puch, and John Wetzel

The Hubsy bot was created to redefine and personalize the way users traditionally manage their HubSpot account. HubSpot is a SaaS system providing marketing, sales, and CRM software. Hubsy allows users of HubSpot to create and log engagements with customers, provide sales teams with deal status, and retrieve client contact information quickly. Hubsy uses Amazon Lex's conversational interface to execute commands from the HubSpot API so that users can gain insights, store and retrieve data, and manage tasks directly from Facebook, Slack, or Alexa.

To implement the Hubsy chatbot, Andrew and the team used AWS Lambda to create a Node.js Lambda function that parses the user's request and calls the HubSpot API, which either fulfills the initial request or returns to the user to ask for more information. Terraform was used to automatically set up and update Lambda, CloudWatch Logs, and IAM profiles. Amazon Lex was used to build the conversational piece of the bot, with utterances that a person on a sales team would likely say when seeking information from HubSpot. To integrate with Alexa, the Amazon Alexa skill builder was used to create an Alexa skill, which was tested on an Echo Dot. CloudWatch Logs are used to capture the Lambda function's logging in order to debug the different Lex intents. To validate the code before the Terraform deployment, ESLint was additionally used to ensure the code was linted and proper development standards were followed.
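The exact code isn't public, but the dispatch pattern is straightforward. Here's a minimal Python sketch of the same idea (the intent and slot names are hypothetical, and the HubSpot endpoint and credential are illustrative; consult the HubSpot API docs for the real calls):

import json
import urllib.request

def lambda_handler(event, context):
    intent = event['currentIntent']['name']
    slots = event['currentIntent']['slots']

    # If a required slot is missing, hand the conversation back to Lex
    if intent == 'GetDealStatus' and not slots.get('DealName'):
        return {
            'dialogAction': {
                'type': 'ElicitSlot',
                'intentName': intent,
                'slots': slots,
                'slotToElicit': 'DealName',
                'message': {'contentType': 'PlainText',
                            'content': 'Which deal would you like the status of?'}
            }
        }

    # Otherwise call out to HubSpot and close the conversation with the result
    req = urllib.request.Request(
        'https://api.hubapi.com/deals/v1/deal/paged',           # illustrative endpoint
        headers={'Authorization': 'Bearer YOUR_HUBSPOT_TOKEN'}  # placeholder credential
    )
    with urllib.request.urlopen(req) as resp:
        deals = json.load(resp)

    return {
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {'contentType': 'PlainText',
                        'content': 'Found {} deals.'.format(len(deals.get('deals', [])))}
        }
    }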


PFMBot by Benny Leong and his team from MoneyLion

PFMBot, the Personal Finance Management Bot, is a bot for use with the MoneyLion finance group, which offers customers online financial products: loans, credit monitoring, and a free credit score service to improve the financial health of their customers. Once a user signs up for an account on the MoneyLion app or website, the user has the option to link their bank accounts with the MoneyLion APIs. Once the bank account is linked to the APIs, the user can log in to their MoneyLion account and start having a conversation with PFMBot based on their bank account information.

The PFMBot UI is a web interface built with JavaScript. The chatbot was created using Amazon Lex to build utterances based on the possible inquiries about the user's MoneyLion bank account. PFMBot uses the Lex built-in AMAZON slot types, parsing and converting the slot values before passing them to AWS Lambda. The AWS Lambda functions interacting with Amazon Lex are Java-based and call MoneyLion's Java-based internal APIs running on Spring Boot. These APIs obtain account data and related bank account information from the MoneyLion MySQL database.


ADP Payroll Innovation Bot by Eric Liu, Jiaxing Yan, and Fan Yang

ADP PI (Payroll Innovation) bot is designed to help employees of ADP customers easily review their own payroll details and compare different payroll data by just asking the bot for results. The ADP PI Bot additionally offers issue reporting functionality for employees to report payroll issues and aids HR managers in quickly receiving and organizing any reported payroll issues.

The ADP Payroll Innovation bot is an ecosystem for ADP payroll consisting of two chatbots: the ADP PI Bot for external clients (employees and HR managers), and the ADP PI DevOps Bot for the internal ADP DevOps team.


The architecture for the ADP PI DevOps bot differs from that of the ADP PI bot, as it is deployed internally to ADP. The ADP PI DevOps bot accepts input from both Slack and Alexa. When input comes in via Slack, Slack sends the request to Lex to process the utterance. Lex then calls the Lambda backend, which obtains ADP data from the ADP VPC running on AWS. When input comes in from Alexa, a Lambda function is called that also obtains data from the ADP VPC.

The architecture for the ADP PI bot has users entering requests and/or reporting issues via Slack. When requests or issues are entered via Slack, the Slack APIs communicate via Amazon API Gateway with AWS Lambda. The Lambda function either writes data into one of the Amazon DynamoDB tables used for recording and escalating issues, or sends the request on to Lex. On the issues side, DynamoDB integrates with Trello to keep HR managers abreast of escalated issues. Once request data is sent from Lambda to Lex, Lex processes the utterance and calls another Lambda function that integrates with the ADP API, pulling ADP data from within the ADP VPC, which runs on Amazon Virtual Private Cloud (VPC).

Python and Node.js were the chosen languages for the development of the bots.
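To give a flavor of the issue-reporting path, here is a minimal Python sketch of a Lambda function behind API Gateway that records a reported payroll issue in DynamoDB (the table name and payload fields are hypothetical, and a real Slack integration would first verify and decode Slack's request format):

import json
import uuid
import boto3

table = boto3.resource('dynamodb').Table('payroll-issues')  # hypothetical table name

def lambda_handler(event, context):
    # With API Gateway proxy integration, the request payload arrives in event['body']
    body = json.loads(event['body'])

    # Record the issue so HR managers can review it (and sync it to Trello)
    table.put_item(Item={
        'issue_id': str(uuid.uuid4()),
        'employee': body.get('employee', 'unknown'),
        'description': body.get('description', ''),
    })

    return {
        'statusCode': 200,
        'body': json.dumps({'text': 'Your payroll issue has been logged.'})
    }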

The ADP PI bot ecosystem has the following functional groupings:

Employee Functionality

  • Summarize Payrolls
  • Compare Payrolls
  • Escalate Issues
  • Evolve PI Bot

HR Manager Functionality

  • Bot Management
  • Audit and Feedback

DevOps Functionality

  • Reduce call volume in service centers (ADP PI Bot)
  • Track issues and generate reports (ADP PI Bot)
  • Monitor jobs for various environments (ADP PI DevOps Bot)
  • View job dashboards (ADP PI DevOps Bot)
  • Query job details (ADP PI DevOps Bot)


Summary

Let's all wish the winners of the AWS Chatbot Challenge hearty congratulations on their excellent projects.

You can review more details on the winning projects, as well as all of the submissions to the AWS Chatbot Challenge, at https://awschatbot2017.devpost.com/submissions. If you are curious about the details of the Chatbot Challenge contest, including resources, rules, prizes, and judges, you can review the original challenge website here: https://awschatbot2017.devpost.com/.

Hopefully, you are just as inspired as I am to build your own chatbot using Lex and Lambda. For more information, take a look at the Amazon Lex developer guide or the AWS AI blog post on Building Better Bots Using Amazon Lex (Part 1).

Chat with you soon!

Tara

Announcement: IPS code

Post Syndicated from Robert Graham original http://blog.erratasec.com/2017/08/announcement-ips-code.html

So after 20 years, IBM is killing off my BlackICE code created in April 1998. So it’s time that I rewrite it.

BlackICE was the first “inline” intrusion-detection system, aka. an “intrusion prevention system” or IPS. ISS purchased my company in 2001 and replaced their RealSecure engine with it, and later renamed it Proventia. Then IBM purchased ISS in 2006. Now, they are formally canceling the project and moving customers onto Cisco’s products, which are based on Snort.

So now is a good time to write a replacement. The reason is that BlackICE worked fundamentally differently than Snort, using protocol analysis rather than pattern-matching. In this way, it worked more like Bro than Snort. The biggest benefit of protocol-analysis is speed, making it many times faster than Snort. The second benefit is better detection ability, as I describe in this post on Heartbleed.
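A toy Python illustration of the difference, detecting an overlong HTTP Host header both ways (this is not BlackICE code; the signature and threshold are made up):

import re

payload = b"GET / HTTP/1.1\r\nHost: " + b"A" * 4096 + b"\r\n\r\n"

# Pattern-matching (Snort-style): scan the raw bytes for a known-bad pattern.
# Every additional signature means another scan over the same payload.
if re.search(rb"Host: A{1024}", payload):
    print("pattern match: possible overflow attempt")

# Protocol analysis (BlackICE/Bro-style): parse the request once into fields,
# then run cheap checks against the parsed structure.
request_line, _, rest = payload.partition(b"\r\n")
headers = dict(line.split(b": ", 1) for line in rest.split(b"\r\n") if b": " in line)
if len(headers.get(b"Host", b"")) > 1024:
    print("protocol analysis: Host header exceeds sane length")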

So my plan is to create a new project. I’ll be checking in the starter bits into GitHub starting a couple weeks from now. I need to figure out a new name for the project, so I don’t have to rip off a name from William Gibson like I did last time :).

Some notes:

  • Yes, it'll be GNU open source. I'm a capitalist, so I'll earn money the way Snort and Nmap do: dual-licensing it, charging companies who don't want to open-source their addons. All capitalists GNU license their code.
  • C, not Rust. Sorry, I’m going for extreme scalability. We’ll re-visit this decision later when looking at building protocol parsers.
  • It’ll be 95% compatible with Snort signatures. Their language definition leaves so much ambiguous it’ll be hard to be 100% compatible.
  • It’ll support Snort output as well, though really, Snort’s events suck.
  • Protocol parsers in Lua, so you can use it as a replacement for Bro, writing parsers to extract data you are interested in.
  • Protocol state machine parsers in C, like you see in my Masscan project for X.509.
  • First version IDS only. These days, "inline" means also being able to MitM the SSL stack, so I'm going to have to think harder on that.
  • Multi-core worker threads off PF_RING/DPDK/netmap receive queues. Should handle 10gbps, tracking 10 million concurrent connections, with a quad-core CPU.
So if you want to contribute to the project, here’s what I need:
  • Requirements from people who work daily with IDS/IPS today. I need you to write up what your products do well that you really like. I need you to write up what they suck at that needs to be fixed. These need to be in some detail.
  • Testing environment to play with. This means having a small server plugged into a real-world link running at a minimum of several gigabits-per-second available for the next year. I’ll sign NDAs related to the data I might see on the network.
  • Coders. I’ll be doing the basic architecture, but protocol parsers, output plugins, etc. will need work. Code will be in C and Lua for the near term. Unfortunately, since I’m going to dual-license, I’ll need waivers before accepting pull requests.
Anyway, follow me on Twitter @erratarob if you want to contribute.

Porn Producer Says He’ll Prove That AMC TV Exec is a BitTorrent Pirate

Post Syndicated from Andy original https://torrentfreak.com/porn-producer-says-hell-prove-that-amc-tv-exec-is-a-bittorrent-pirate-170818/

When people are found sharing copyrighted pornographic content online in the United States, there's always a chance that an angry studio will attempt to track down the perpetrator in pursuit of a cash settlement.

That’s what adult studio Flava Works did recently, after finding its content being shared without permission on a number of gay-focused torrent sites. It’s now clear that their target was Marc Juris, President & General Manager of AMC-owned WE tv. Until this week, however, that information was secret.

As detailed in our report yesterday, Flava Works contacted Juris with an offer of around $97,000 to settle the case before trial. And, crucially, before Juris was publicly named in a lawsuit. If Juris decided not to pay, that amount would increase significantly, Flava Works CEO Phillip Bleicher told him at the time.

Not only did Juris not pay, he actually went on the offensive, filing a ‘John Doe’ complaint in a California district court which accused Flava Works of extortion and blackmail. It’s possible that Juris felt that this would cause Flava Works to back off but in fact, it had quite the opposite effect.

In a complaint filed this week in an Illinois district court, Flava Works named Juris and accused him of a broad range of copyright infringement offenses.

The complaint alleges that Juris was a signed-up member of Flava Works’ network of websites, from where he downloaded pornographic content as his subscription allowed. However, it’s claimed that Juris then uploaded this material elsewhere, in breach of copyright law.

“Defendant downloaded copyrighted videos of Flava Works as part of his paid memberships and, in violation of the terms and conditions of the paid sites, posted and distributed the aforesaid videos on other websites, including websites with peer to peer sharing and torrents technology,” the complaint reads.

“As a result of Defendant’ conduct, third parties were able to download the copyrighted videos, without permission of Flava Works.”

In addition to demanding injunctions against Juris, Flava Works asks the court for a judgment in its favor amounting to a cool $1.2m, more than twelve times the amount it was initially prepared to settle for. It’s a huge amount, but according to CEO Phillip Bleicher, it’s what his company is owed, despite Juris being a former customer.

“Juris was a member of various Flava Works websites at various times dating back to 2006. He is no longer a member and his login info has been blocked by us to prevent him from re-joining,” Bleicher informs TF.

“We allow full downloads, although each download a person performs, it tags the video with a hidden code that identifies who the user was that downloaded it and their IP info and date / time.”

We asked Bleicher how he can be sure that the content downloaded from Flava Works and re-uploaded elsewhere was actually uploaded by Juris. Fine details weren’t provided but he’s insistent that the company’s evidence holds up.

“We identified him directly, this was done by cross referencing all his IP logins with Flava Works, his email addresses he used and his usernames. We can confirm that he is/was a member of Gay-Torrents.org and Gayheaven.org. We also believe (we will find out in discovery) that he is a member of a Russian file sharing site called GayTorrent.Ru,” he says.

While the technicalities of who downloaded and shared what will be something for the court to decide, there are still Juris' allegations that Bleicher used extortion-like practices to get him to settle, and that he used Juris' relative fame against him. Bleicher says that's not how things played out.

“[Juris] hired an attorney and they agreed to settle out of court. But then we saw him still accessing the file sharing sites (one site shows a user’s last login) and we were waiting on the settlement agreement to be drafted up by his attorney,” he explains.

“When he kept pushing the date of when we would see an agreement back we gave him a final deadline and said that after this date we would sue [him] and with all lawsuits – we make a press release.”

Bleicher says at this point Juris replaced his legal team and hired lawyer Mark Geragos, who Bleicher says tried to “bully” him, warning him of potential criminal offenses.

“Your threats in the last couple months to ‘expose’ Mr. Juris knowing he is a high profile individual, i.e., today you threatened to issue a press release, to induce him into wiring you close to $100,000 is outright extortion and subject to criminal prosecution,” Geragos wrote.

“I suggest you direct your attention to various statutes which specifically criminalize your conduct in the various jurisdictions where you have threatened suit.”

Interestingly, Geragos then went on to suggest that the lawsuit may ultimately backfire, since going public might affect Flava Works’ reputation in the gay market.

“With respect to Mr. Juris, your actions have been nothing but extortion and we reject your attempts and will vigorously pursue all available remedies against you,” Geragos’ email reads.

“We intend to use the platform you have provided to raise awareness in the LGBTQ community of this new form of digital extortion that you promote.”

But Bleicher, it seems, is up for a fight.

“Marc knows what he did and enjoyed downloading our videos and sharing them and those of videos of other studios, but now he has been caught,” he told the lawyer.

"This is the kind of case I would like to take all the way to trial, win or lose. It shows people that want to steal our copyrighted videos that we aggressively protect our intellectual property."

But to the tune of $1.2m? Apparently so.

“We could get up to $150,000 per infringement – we have solid proof of eight full videos – not to mention we have caught [Juris] downloading many other studios’ videos too – I think – but not sure – the number was over 75,” Bleicher told TF.

It's quite rare for this kind of dispute to play out in public, especially considering Juris' profile and occupation. Only time will tell if this will ultimately end in a settlement, but Bleicher and Juris seem determined at this stage to stand their ground and fight this out in court.

Complaint (pdf)


Court Orders Aussie ISPs to Block Dozens of Pirate Sites

Post Syndicated from Ernesto original https://torrentfreak.com/court-orders-aussie-isps-to-block-dozens-of-pirate-sites-170818/

Rather than taking site operators to court, copyright holders increasingly demand that Internet providers should block access to ‘pirate’ domains.

As a result, courts all around the world have ordered ISPs to block subscriber access to various pirate sites.

This is also happening in Australia where the first blockades were issued late last year. In December, the Federal Court ordered ISPs to block The Pirate Bay and several other sites, which happened soon after.

However, as is often the case with website blocking, one order is not enough as there are still plenty of pirate sites and proxies readily available. So, several rightsholders including movie studio Village Roadshow and local broadcaster Foxtel went back to court.

Today the Federal Court ruled on two applications that cover 59 pirate sites in total, including many popular torrent and streaming portals.

The first order was issued by Justice John Nicholas, who directed several Internet providers including IINet, Telstra, and TPG to block access to several pirate sites. The request came from Village Roadshow, which was backed by several major Hollywood studios.

The order directs the ISPs to stop passing on traffic to 41 torrent and streaming platforms including Demonoid, RARBG, EZTV, YTS, Gomovies, and Fmovies. The full list of blocked domains is even longer, as it also covers several proxies.

“The infringement or facilitation of infringement by the Online Locations is flagrant and reflect a blatant disregard for the rights of copyright owners,” the order reads.

“By way of illustration, one of the Online Locations is accessible via the domain name ‘istole.it’ and it and many others include notices encouraging users to implement technology to frustrate any legal action that might be taken by copyright owners.”

In a separate order handed down by Federal Court Judge Stephen Burley, another 17 sites are ordered blocked following a request from Foxtel. This includes popular pirate sites such as 1337x, Torlock, Putlocker, YesMovies, Vumoo, and LosMovies.

The second order also includes a wide variety of alternative locations, including proxies, which brings the total number of targeted domain names to more than 160.

As highlighted by SMH, the orders coincide with the launch of a new anti-piracy campaign dubbed "The Price of Piracy," which is organized by Creative Content Australia. Lori Flekser, executive director of the non-profit organization, believes that the blockades will help to significantly deter piracy.

“Not only is there decreasing traffic to pirate sites but there is a subsequent increase in traffic to legal sites,” she said.

At the same time, she warns people not to visit proxy and mirror sites, as these could be dangerous. This message is also repeated by her organization’s campaign, which warns that pirate sites can be filled with ransomware, spyware, trojans, viruses, bots, rootkits and worms.


timeShift(GrafanaBuzz, 1w) Issue 9

Post Syndicated from Blogs on Grafana Labs Blog original https://grafana.com/blog/2017/08/18/timeshiftgrafanabuzz-1w-issue-9/

Matt from Grafana NYC spent the week visiting Stockholm to focus on v5.0 with Torkel. Despite warnings otherwise, the weather has been beautiful, making a nice backdrop for many UX discussions. Very, very excited to soon show what we’ve been working on.


Latest Release

Grafana v4.4.3 is available for download.

To see the full changelog, head over to our community site.


Grafana <3 Prometheus

Our very own Carl Bergquist spoke at PromCon 2017 yesterday in Munich, highlighting recent Grafana features and enhancements.

We also used the opportunity to debut our coming Prometheus query editor with a load of new functionality; it seems the community approves. In fact, this is our most popular tweet ever!


From the Blogosphere

  • Wikimedia Metrics: A tweet this week reminded us of the public metrics Wikimedia exposes using Grafana. Exploring the performance stats in real time for the 5th most popular site on the internet is pretty fun.

  • Creating Grafana Annotations with InfluxDB: Nice short article by Max Chadwick showing how to quickly add InfluxDB as a source for Grafana annotations.


This week’s MVC (Most Valuable Contributor)

This week’s MVC highlights what is great about Open Source software.

ericslaw submitted his first PR to a public project this past week. Speaking from personal experience, submitting a PR can feel daunting, and we were lucky that he chose Grafana. Even the smallest contributions, like Eric's fix for a bogus link within our templating, have a big impact.


Tweet of the Week

We scour Twitter each week to find an interesting/beautiful dashboard and show it off! #monitoringLove

Seems the excitement about Prometheus and Grafana has also caught the attention of a certain superhero.



What do you think?

That wraps up another issue. Hope you’re finding these roundups valuable. Let us know how we’re doing! Submit a comment on this article below, or post something at our community forum. Help us make this better!

Follow us on Twitter, like us on Facebook, and join the Grafana Labs community.

Analyzing AWS Cost and Usage Reports with Looker and Amazon Athena

Post Syndicated from Dillon Morrison original https://aws.amazon.com/blogs/big-data/analyzing-aws-cost-and-usage-reports-with-looker-and-amazon-athena/

This is a guest post by Dillon Morrison at Looker. Looker is, in their own words, “a new kind of analytics platform–letting everyone in your business make better decisions by getting reliable answers from a tool they can use.” 

As the breadth of AWS products and services continues to grow, customers are able to more easily move their technology stack and core infrastructure to AWS. One of the attractive benefits of AWS is the cost savings. Rather than paying upfront capital expenses for large on-premises systems, customers can instead pay variable expenses for on-demand services. To further reduce expenses, AWS users can reserve resources for specific periods of time, and automatically scale resources as needed.

The AWS Cost Explorer is great for aggregated reporting. However, conducting analysis on the raw data using the flexibility and power of SQL allows for much richer detail and insight, and can be the better choice for the long term. Thankfully, with the introduction of Amazon Athena, monitoring and managing these costs is now easier than ever.

In this post, I walk through setting up the data pipeline for cost and usage reports, Amazon S3, and Athena, and discuss some of the most common levers for cost savings. I surface tables through Looker, which comes with a host of pre-built data models and dashboards to make analysis of your cost and usage data simple and intuitive.

Analysis with Athena

With Athena, there’s no need to create hundreds of Excel reports, move data around, or deploy clusters to house and process data. Athena uses Apache Hive’s DDL to create tables, and the Presto querying engine to process queries. Analysis can be performed directly on raw data in S3. Conveniently, AWS exports raw cost and usage data directly into a user-specified S3 bucket, making it simple to start querying with Athena quickly. This makes continuous monitoring of costs virtually seamless, since there is no infrastructure to manage. Instead, users can leverage the power of the Athena SQL engine to easily perform ad-hoc analysis and data discovery without needing to set up a data warehouse.

After the data pipeline is established, cost and usage data (the recommended billing data, per AWS documentation) provides a plethora of comprehensive information around usage of AWS services and the associated costs. Whether you need the report segmented by product type, user identity, or region, this report can be cut-and-sliced any number of ways to properly allocate costs for any of your business needs. You can then drill into any specific line item to see even further detail, such as the selected operating system, tenancy, purchase option (on-demand, spot, or reserved), and so on.
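For example, once the cost_and_usage table from the walkthrough below is in place, pulling total unblended cost per service is a few lines of boto3 (the results bucket is a placeholder; note that the DDL below declares every column as a string, hence the cast):

import boto3

athena = boto3.client('athena')

# Aggregate unblended cost per AWS service from the cost and usage table
query = """
SELECT lineItem_ProductCode,
       SUM(CAST(lineItem_UnblendedCost AS double)) AS total_cost
FROM cost_and_usage
GROUP BY lineItem_ProductCode
ORDER BY total_cost DESC
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={'Database': 'aws_optimizer'},  # the database created below
    ResultConfiguration={'OutputLocation': 's3://your-athena-results-bucket/'}
)
print(response['QueryExecutionId'])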

Walkthrough

By default, the Cost and Usage report exports CSV files, which you can compress using gzip (recommended for performance). There are some additional configuration options for tuning performance further, which are discussed below.

Prerequisites

If you want to follow along, you need the following resources:

Enable the cost and usage reports

First, enable the Cost and Usage report. For Time unit, select Hourly. For Include, select Resource IDs. All options are prompted in the report-creation window.

The Cost and Usage report dumps CSV files into the specified S3 bucket. Please note that it can take up to 24 hours for the first file to be delivered after enabling the report.

Configure the S3 bucket and files for Athena querying

In addition to the CSV file, AWS also creates a JSON manifest file for each cost and usage report. Athena requires that all of the files in the S3 bucket are in the same format, so we need to get rid of all these manifest files. If you’re looking to get started with Athena quickly, you can simply go into your S3 bucket and delete the manifest file manually, skip the automation described below, and move on to the next section.

To automate the process of removing the manifest file each time a new report is dumped into S3, which I recommend as you scale, there are a few additional steps. The folks at Concurrency labs wrote a great overview and set of scripts for this, which you can find in their GitHub repo.

These scripts take the data from an input bucket, remove anything unnecessary, and dump it into a new output bucket. We can utilize AWS Lambda to trigger this process whenever new data is dropped into S3, or on a nightly basis, or whatever makes most sense for your use-case, depending on how often you’re querying the data. Please note that enabling the “hourly” report means that data is reported at the hour-level of granularity, not that a new file is generated every hour.
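A stripped-down Python sketch of such a trigger is below (bucket names are placeholders; the Concurrency Labs scripts linked above are the more complete reference):

import boto3

s3 = boto3.client('s3')
OUTPUT_BUCKET = 'cur-cleaned'  # placeholder output bucket

def lambda_handler(event, context):
    # Invoked by an S3 put event on the input bucket
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        # Skip the JSON manifest files; copy only the report CSVs across
        if key.endswith('.json'):
            continue
        s3.copy_object(Bucket=OUTPUT_BUCKET, Key=key,
                       CopySource={'Bucket': bucket, 'Key': key})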

Following these scripts, you’ll notice that we’re adding a date partition field, which isn’t necessary but improves query performance. In addition, converting data from CSV to a columnar format like ORC or Parquet also improves performance. We can automate this process using Lambda whenever new data is dropped in our S3 bucket. Amazon Web Services discusses columnar conversion at length, and provides walkthrough examples, in their documentation.

As a long-term solution, best practice is to use compression, partitioning, and conversion. However, for purposes of this walkthrough, we’re not going to worry about them so we can get up-and-running quicker.

Set up the Athena query engine

In your AWS console, navigate to the Athena service, and click “Get Started”. Follow the tutorial and set up a new database (we’ve called ours “aws_optimizer” in this example, matching the queries later in this post). Don’t worry about configuring your initial table, per the tutorial instructions; we’ll be creating a new table for cost and usage analysis. Once you’ve walked through the tutorial steps, you’ll be able to access the Athena interface and can begin running Hive DDL statements to create new tables.
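
If you’d rather skip the console tutorial for the database step, the equivalent DDL is a one-liner in the query editor (the name below matches the aws_optimizer database used in the queries later in this post):

-- Create the database used throughout this walkthrough.
CREATE DATABASE IF NOT EXISTS aws_optimizer;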

One important thing to note is that the Cost and Usage CSVs contain the column headers in their first row, which means the headers would otherwise be included in the dataset and in any queries. For testing and quick set-up, you can remove this line manually from your first few CSV files. Long-term, you’ll want a script to programmatically remove this row each time a new file is dropped in S3 (typically every few hours). We’ve drafted a sample script for ease of reference, which we run on Lambda, utilizing Lambda’s native ability to invoke the script whenever a new object is dropped in S3.

For cost and usage, we recommend the DDL statement below. Since our data is in CSV format, we don’t need a custom SerDe; we can simply specify the separatorChar, quoteChar, and escapeChar values and the structure of the files (TEXTFILE). Note that AWS does have an OpenCSV SerDe as well, if you prefer to use that (see the sketch after the DDL).


CREATE EXTERNAL TABLE IF NOT EXISTS cost_and_usage (
identity_LineItemId String,
identity_TimeInterval String,
bill_InvoiceId String,
bill_BillingEntity String,
bill_BillType String,
bill_PayerAccountId String,
bill_BillingPeriodStartDate String,
bill_BillingPeriodEndDate String,
lineItem_UsageAccountId String,
lineItem_LineItemType String,
lineItem_UsageStartDate String,
lineItem_UsageEndDate String,
lineItem_ProductCode String,
lineItem_UsageType String,
lineItem_Operation String,
lineItem_AvailabilityZone String,
lineItem_ResourceId String,
lineItem_UsageAmount String,
lineItem_NormalizationFactor String,
lineItem_NormalizedUsageAmount String,
lineItem_CurrencyCode String,
lineItem_UnblendedRate String,
lineItem_UnblendedCost String,
lineItem_BlendedRate String,
lineItem_BlendedCost String,
lineItem_LineItemDescription String,
lineItem_TaxType String,
product_ProductName String,
product_accountAssistance String,
product_architecturalReview String,
product_architectureSupport String,
product_availability String,
product_bestPractices String,
product_cacheEngine String,
product_caseSeverityresponseTimes String,
product_clockSpeed String,
product_currentGeneration String,
product_customerServiceAndCommunities String,
product_databaseEdition String,
product_databaseEngine String,
product_dedicatedEbsThroughput String,
product_deploymentOption String,
product_description String,
product_durability String,
product_ebsOptimized String,
product_ecu String,
product_endpointType String,
product_engineCode String,
product_enhancedNetworkingSupported String,
product_executionFrequency String,
product_executionLocation String,
product_feeCode String,
product_feeDescription String,
product_freeQueryTypes String,
product_freeTrial String,
product_frequencyMode String,
product_fromLocation String,
product_fromLocationType String,
product_group String,
product_groupDescription String,
product_includedServices String,
product_instanceFamily String,
product_instanceType String,
product_io String,
product_launchSupport String,
product_licenseModel String,
product_location String,
product_locationType String,
product_maxIopsBurstPerformance String,
product_maxIopsvolume String,
product_maxThroughputvolume String,
product_maxVolumeSize String,
product_maximumStorageVolume String,
product_memory String,
product_messageDeliveryFrequency String,
product_messageDeliveryOrder String,
product_minVolumeSize String,
product_minimumStorageVolume String,
product_networkPerformance String,
product_operatingSystem String,
product_operation String,
product_operationsSupport String,
product_physicalProcessor String,
product_preInstalledSw String,
product_proactiveGuidance String,
product_processorArchitecture String,
product_processorFeatures String,
product_productFamily String,
product_programmaticCaseManagement String,
product_provisioned String,
product_queueType String,
product_requestDescription String,
product_requestType String,
product_routingTarget String,
product_routingType String,
product_servicecode String,
product_sku String,
product_softwareType String,
product_storage String,
product_storageClass String,
product_storageMedia String,
product_technicalSupport String,
product_tenancy String,
product_thirdpartySoftwareSupport String,
product_toLocation String,
product_toLocationType String,
product_training String,
product_transferType String,
product_usageFamily String,
product_usagetype String,
product_vcpu String,
product_version String,
product_volumeType String,
product_whoCanOpenCases String,
pricing_LeaseContractLength String,
pricing_OfferingClass String,
pricing_PurchaseOption String,
pricing_publicOnDemandCost String,
pricing_publicOnDemandRate String,
pricing_term String,
pricing_unit String,
reservation_AvailabilityZone String,
reservation_NormalizedUnitsPerReservation String,
reservation_NumberOfReservations String,
reservation_ReservationARN String,
reservation_TotalReservedNormalizedUnits String,
reservation_TotalReservedUnits String,
reservation_UnitsPerReservation String,
resourceTags_userName String,
resourceTags_usercostcategory String
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  ESCAPED BY '\\'
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE
LOCATION 's3://<<your bucket name>>';
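
If you prefer the OpenCSV SerDe mentioned above, a hedged sketch of the alternative is below; it would replace everything from ROW FORMAT DELIMITED onward in the statement above. On newer versions of Athena, the skip.header.line.count table property can also remove the need for the header-stripping script described earlier:

ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  'separatorChar' = ',',
  'quoteChar'     = '"',
  'escapeChar'    = '\\'
)
STORED AS TEXTFILE
LOCATION 's3://<<your bucket name>>'
-- Supported on newer Athena versions; skips the CSV header row:
TBLPROPERTIES ('skip.header.line.count' = '1');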

Once you’ve successfully executed the CREATE TABLE statement, you should see a new table named “cost_and_usage” with the columns defined above. Now we’re ready to start executing queries and running analysis!
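
Before connecting any tools, a quick smoke test (our own sanity check, not a step from the original setup) confirms that Athena can parse the files:

-- Pull a handful of rows to verify the table reads cleanly.
SELECT lineitem_productcode, lineitem_usagestartdate, lineitem_unblendedcost
FROM aws_optimizer.cost_and_usage
LIMIT 10;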

Start with Looker and connect to Athena

Setting up Looker is a quick process, and a free trial is available (you can also get it through AWS Marketplace). It takes just a few seconds to connect Looker to your Athena database, and Looker comes with a host of pre-built data models and dashboards to make analysis of your cost and usage data simple and intuitive. After you’re connected, you can use the Looker UI to run whatever analysis you’d like. Looker translates this UI to optimized SQL, so any user can execute and visualize queries for true self-service analytics.

Major cost saving levers

Now that the data pipeline is configured, you can dive into the most popular use cases for cost savings. In this post, I focus on:

  • Purchasing Reserved Instances vs. On-Demand Instances
  • Data transfer costs
  • Allocating costs across users or other attributes (denoted with resource tags)

On-Demand, Spot, and Reserved Instances

Choosing Reserved Instances over On-Demand Instances is arguably the biggest cost lever for heavy AWS users (Reserved Instances can run up to 75% cheaper). AWS offers three options for purchasing instances:

  • On-Demand—Pay as you use.
  • Spot (variable cost)—Bid on spare Amazon EC2 computing capacity.
  • Reserved Instances—Pay for an instance for a specific, allotted period of time.

When purchasing a Reserved Instance, you can also choose to pay all-upfront, partial-upfront, or monthly. The more you pay upfront, the greater the discount.

If your company has been using AWS for some time now, you should have a good sense of your overall instance usage on a per-month or per-day basis. Rather than paying for these instances On-Demand, you should try to forecast the number of instances you’ll need, and reserve them with upfront payments.

The share of your total instance usage that runs on Reserved Instances is called your coverage ratio. It’s important not to confuse your coverage ratio with your Reserved Instance utilization, which represents the amount of reserved hours that were actually used. Don’t worry about exceeding capacity; you can still set up Auto Scaling preferences so that more instances are added whenever your coverage or utilization crosses a certain threshold (we often see savvy customers target 80% for both).
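
To make the distinction concrete, here is a simplified sketch of the coverage calculation for EC2, using the columns from the table created earlier. It treats ‘DiscountedUsage’ line items as reserved usage and casts the string-typed columns explicitly; the fuller query below breaks the comparison out by month and by cost:

-- Simplified coverage ratio for EC2: reserved usage hours over total usage
-- hours. Columns are declared as strings in the DDL, hence the casts.
SELECT
  SUM(CASE WHEN lineitem_lineitemtype = 'DiscountedUsage'
           THEN CAST(lineitem_usageamount AS double) ELSE 0 END)
    / NULLIF(SUM(CAST(lineitem_usageamount AS double)), 0) AS coverage_ratio
FROM aws_optimizer.cost_and_usage
WHERE lineitem_productcode = 'AmazonEC2';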

Calculating the reserved costs and coverage can be a bit tricky with the level of granularity provided by the cost and usage report. The following query shows your total cost over the last 6 months, broken out by Reserved Instance vs other instance usage. You can substitute the cost field for usage if you’d prefer. Please note that you should only have data for the time period after the cost and usage report has been enabled (though you can opt for up to 3 months of historical data by contacting your AWS Account Executive). If you’re just getting started, this query will only show a few days.


SELECT 
	DATE_FORMAT(from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate),'%Y-%m') AS "cost_and_usage.usage_start_month",
	COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0) AS "cost_and_usage.total_unblended_cost",
	COALESCE(SUM(CASE WHEN (CASE
         WHEN cost_and_usage.lineitem_lineitemtype = 'DiscountedUsage' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'RIFee' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'Fee' THEN 'RI Line Item'
         ELSE 'Non RI Line Item'
        END = 'RI Line Item') THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0) AS "cost_and_usage.total_reserved_unblended_cost",
	1.0 * (COALESCE(SUM(CASE WHEN (CASE
         WHEN cost_and_usage.lineitem_lineitemtype = 'DiscountedUsage' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'RIFee' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'Fee' THEN 'RI Line Item'
         ELSE 'Non RI Line Item'
        END = 'RI Line Item') THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0)) / NULLIF((COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0)),0)  AS "cost_and_usage.percent_spend_on_ris",
	COALESCE(SUM(CASE WHEN (CASE
         WHEN cost_and_usage.lineitem_lineitemtype = 'DiscountedUsage' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'RIFee' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'Fee' THEN 'RI Line Item'
         ELSE 'Non RI Line Item'
        END = 'Non RI Line Item') THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0) AS "cost_and_usage.total_non_reserved_unblended_cost",
	1.0 * (COALESCE(SUM(CASE WHEN (CASE
         WHEN cost_and_usage.lineitem_lineitemtype = 'DiscountedUsage' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'RIFee' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'Fee' THEN 'RI Line Item'
         ELSE 'Non RI Line Item'
        END = 'Non RI Line Item') THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0)) / NULLIF((COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0)),0)  AS "cost_and_usage.percent_spend_on_non_ris"
FROM aws_optimizer.cost_and_usage  AS cost_and_usage

WHERE 
	(((from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) >= ((DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))) AND (from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) < ((DATE_ADD('month', 6, DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))))))
GROUP BY 1
ORDER BY 2 DESC
LIMIT 500

The resulting table should look something like the image below (I’m surfacing tables through Looker, though the same table would result from querying via command line or any other interface).

With a BI tool, you can create dashboards for easy reference and monitoring. New data is dumped into S3 every few hours, so your dashboards can update several times per day.

It’s an iterative process to understand the appropriate number of Reserved Instances needed to meet your business needs. After you’ve properly integrated Reserved Instances into your purchasing patterns, the savings can be significant. If your coverage is consistently below 70%, you should seriously consider adjusting your purchase types and opting for more Reserved Instances.

Data transfer costs

One of the great things about AWS data storage is that it’s incredibly cheap; most charges come from moving and processing that data. There are several different prices for transferring data, broken out largely by transfers between regions and between Availability Zones. Transfers between regions are the most costly, followed by transfers between Availability Zones. Transfers within the same region and the same Availability Zone are free unless you use Elastic or public IP addresses, in which case there is a cost. You can find more detailed information in the AWS Pricing Docs. With this in mind, there are several simple strategies for reducing costs.

First, since costs increase when transferring data between regions, it’s wise to ensure that as many services as possible reside within the same region. The more you can localize services to one specific region, the lower your costs will be.

Second, you should maximize the data you’re routing directly within AWS services and IP addresses. Transfers out to the open internet are the most costly and least performant way to move data, so it’s best to keep transfers within AWS services.

Lastly, data transfers between private IP addresses are cheaper than between Elastic or public IP addresses, so utilizing private IP addresses as much as possible is the most cost-effective strategy.

The following query produces a table of the total costs for each AWS product, broken out by transfer cost type. Substitute the lineitem_productcode field in the query to segment the costs by any other attribute. If you notice any unusually high spikes in cost, you’ll need to dig deeper to understand what’s driving that spike: location, volume, and so on. Drill down into specific costs by including product_usagetype and product_transfertype in your query to identify the types of transfer costs that are driving up your bill (see the sketch after the query).

SELECT 
	cost_and_usage.lineitem_productcode  AS "cost_and_usage.product_code",
	COALESCE(SUM(cost_and_usage.lineitem_unblendedcost), 0) AS "cost_and_usage.total_unblended_cost",
	COALESCE(SUM(CASE WHEN REGEXP_LIKE(cost_and_usage.product_usagetype, 'DataTransfer')    THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0) AS "cost_and_usage.total_data_transfer_cost",
	COALESCE(SUM(CASE WHEN REGEXP_LIKE(cost_and_usage.product_usagetype, 'DataTransfer-In')    THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0) AS "cost_and_usage.total_inbound_data_transfer_cost",
	COALESCE(SUM(CASE WHEN REGEXP_LIKE(cost_and_usage.product_usagetype, 'DataTransfer-Out')    THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0) AS "cost_and_usage.total_outbound_data_transfer_cost"
FROM aws_optimizer.cost_and_usage  AS cost_and_usage

WHERE 
	(((from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) >= ((DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))) AND (from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) < ((DATE_ADD('month', 6, DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))))))
GROUP BY 1
ORDER BY 2 DESC
LIMIT 500

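As mentioned above, including the usage type reveals which transfer categories drive the bill. Here is a hedged sketch of that drill-down for a single product; the ‘AmazonEC2’ product code and the explicit cast are our own choices for illustration:

-- Illustrative drill-down: transfer costs by usage type for one product.
SELECT
  cost_and_usage.product_usagetype  AS "cost_and_usage.usage_type",
  COALESCE(SUM(CAST(cost_and_usage.lineitem_unblendedcost AS double)), 0) AS "cost_and_usage.total_unblended_cost"
FROM aws_optimizer.cost_and_usage  AS cost_and_usage
WHERE REGEXP_LIKE(cost_and_usage.product_usagetype, 'DataTransfer')
  AND cost_and_usage.lineitem_productcode = 'AmazonEC2'
GROUP BY 1
ORDER BY 2 DESC
LIMIT 50;
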
When moving between regions or over the open web, many data transfer costs also include the origin and destination location of the data movement. Using a BI tool with mapping capabilities, you can get a nice visual of data flows. The point at the center of the map is used to represent external data flows over the open internet.

Analysis by tags

AWS provides the option to apply custom tags to individual resources, so you can allocate costs over whatever customized segment makes the most sense for your business. For a SaaS company that hosts software for customers on AWS, for example, you might tag resources by the size of each customer. The following query uses custom tags to display the reserved, data transfer, and total cost for each AWS service, broken out by tag categories, over the last 6 months. Substitute cost_and_usage.resourcetags_customersegment and cost_and_usage.customer_segment with the name of your own tag field.


SELECT * FROM (
SELECT *, DENSE_RANK() OVER (ORDER BY z___min_rank) as z___pivot_row_rank, RANK() OVER (PARTITION BY z__pivot_col_rank ORDER BY z___min_rank) as z__pivot_col_ordering FROM (
SELECT *, MIN(z___rank) OVER (PARTITION BY "cost_and_usage.product_code") as z___min_rank FROM (
SELECT *, RANK() OVER (ORDER BY CASE WHEN z__pivot_col_rank=1 THEN (CASE WHEN "cost_and_usage.total_unblended_cost" IS NOT NULL THEN 0 ELSE 1 END) ELSE 2 END, CASE WHEN z__pivot_col_rank=1 THEN "cost_and_usage.total_unblended_cost" ELSE NULL END DESC, "cost_and_usage.total_unblended_cost" DESC, z__pivot_col_rank, "cost_and_usage.product_code") AS z___rank FROM (
SELECT *, DENSE_RANK() OVER (ORDER BY CASE WHEN "cost_and_usage.customer_segment" IS NULL THEN 1 ELSE 0 END, "cost_and_usage.customer_segment") AS z__pivot_col_rank FROM (
SELECT 
	cost_and_usage.lineitem_productcode  AS "cost_and_usage.product_code",
	cost_and_usage.resourcetags_customersegment  AS "cost_and_usage.customer_segment",
	COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0) AS "cost_and_usage.total_unblended_cost",
	1.0 * (COALESCE(SUM(CASE WHEN REGEXP_LIKE(cost_and_usage.product_usagetype, 'DataTransfer')    THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0)) / NULLIF((COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0)),0)  AS "cost_and_usage.percent_spend_data_transfers_unblended",
	1.0 * (COALESCE(SUM(CASE WHEN (CASE
         WHEN cost_and_usage.lineitem_lineitemtype = 'DiscountedUsage' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'RIFee' THEN 'RI Line Item'
         WHEN cost_and_usage.lineitem_lineitemtype = 'Fee' THEN 'RI Line Item'
         ELSE 'Non RI Line Item'
        END = 'Non RI Line Item') THEN cost_and_usage.lineitem_unblendedcost  ELSE NULL END), 0)) / NULLIF((COALESCE(SUM(cost_and_usage.lineitem_unblendedcost ), 0)),0)  AS "cost_and_usage.unblended_percent_spend_on_ris"
FROM aws_optimizer.cost_and_usage  AS cost_and_usage

WHERE 
	(((from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) >= ((DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))) AND (from_iso8601_timestamp(cost_and_usage.lineitem_usagestartdate)) < ((DATE_ADD('month', 6, DATE_ADD('month', -5, DATE_TRUNC('MONTH', CAST(NOW() AS DATE))))))))
GROUP BY 1,2) ww
) bb WHERE z__pivot_col_rank <= 16384
) aa
) xx
) zz
 WHERE z___pivot_row_rank <= 500 OR z__pivot_col_ordering = 1 ORDER BY z___pivot_row_rank

The resulting table in this example looks like the results below. In this example, you can tell that we’re making poor use of Reserved Instances because they represent such a small portion of our overall costs.

Again, using a BI tool to visualize these costs and trends over time makes the analysis much easier to consume and take action on.
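
Because the pivoted SQL above is machine-generated by Looker, it can be hard to read. If you’re writing queries by hand, a plain aggregation (a sketch using the same placeholder tag column, with an explicit cast on the string-typed cost column) returns the same underlying numbers without the pivot scaffolding:

-- Hand-written equivalent of the pivoted query above (sketch).
SELECT
  cost_and_usage.lineitem_productcode  AS "cost_and_usage.product_code",
  cost_and_usage.resourcetags_customersegment  AS "cost_and_usage.customer_segment",
  COALESCE(SUM(CAST(cost_and_usage.lineitem_unblendedcost AS double)), 0) AS "cost_and_usage.total_unblended_cost"
FROM aws_optimizer.cost_and_usage  AS cost_and_usage
GROUP BY 1, 2
ORDER BY 3 DESC
LIMIT 500;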

Summary

Saving costs on your AWS spend is always an iterative, ongoing process. Hopefully, with these queries alone, you can start to understand your spending patterns and identify opportunities for savings. However, this is just a peek into the many opportunities available through analysis of the Cost and Usage report. Each company is different, with unique needs and usage patterns. To achieve maximum cost savings, we encourage you to set up an analytics environment that enables your team to explore all potential cuts and slices of your usage data whenever necessary. Exploring different trends and spikes across regions, services, user types, and so on helps you gain a comprehensive understanding of your major cost levers and consistently implement new cost-reduction strategies.

Note that all of the queries and analysis provided in this post were generated using the Looker data platform. If you’re already a Looker customer, you can get all of this analysis, additional pre-configured dashboards, and much more using Looker Blocks for AWS.


About the Author

Dillon Morrison leads the Platform Ecosystem at Looker. He enjoys exploring new technologies and architecting the most efficient data solutions for the business needs of his company and their customers. In his spare time, you’ll find Dillon rock climbing in the Bay Area or nose deep in the docs of the latest AWS product release at his favorite cafe (“Arlequin in SF is unbeatable!”).


Cloudflare Kicking ‘Daily Stormer’ is Bad News For Pirate Sites

Post Syndicated from Ernesto original https://torrentfreak.com/cloudflare-kicking-daily-stormer-is-bad-news-for-pirate-sites-170817/

“I woke up this morning in a bad mood and decided to kick them off the Internet.”

Those are the words of Cloudflare CEO Matthew Prince, who decided to terminate the account of controversial Neo-Nazi site Daily Stormer.

Bam. Gone. At least for a while.

Although many people are happy to see the site go offline, the decision is not without consequence. It goes directly against what many saw as the core values of the company.

For years on end, Cloudflare has been asked to remove terrorist propaganda, pirate sites, and other possibly unacceptable content. Each time, Cloudflare replied that it doesn’t take action without a court order. No exceptions.

“Even if it were able to, Cloudflare does not monitor, evaluate, judge or store content appearing on a third party website,” the company wrote just a few weeks ago, in its whitepaper on intermediary liability.

“We’re the plumbers of the internet. We make the pipes work but it’s not right for us to inspect what is or isn’t going through the pipes,” Cloudflare CEO Matthew Prince himself said not too long ago.

“If companies like ours or ISPs start censoring there would be an uproar. It would lead us down a path of internet censors and controls akin to a country like China,” he added.

The same arguments were repeated in different contexts, over and over.

This strong position was also one of the reasons why Cloudflare was dragged into various copyright infringement court cases. In these cases, the company repeatedly stressed that removing a site from Cloudflare’s service would not make infringing content disappear.

Pirate sites would just require a simple DNS reconfiguration to continue their operation, after all.

“[T]here are no measures of any kind that CloudFlare could take to prevent this alleged infringement, because the termination of CloudFlare’s CDN services would have no impact on the existence and ability of these allegedly infringing websites to continue to operate,” it said.

That comment looks rather misplaced now that the CEO of the same company has decided to “kick” a website “off the Internet” after an emotional, but deliberate, decision.

Taking a page from Cloudflare’s (old) playbook we’re not going to make any judgments here. Just search Twitter or any social media site and you’ll see plenty of opinions, both for and against the company’s actions.

We do have a prediction though. During the months and years to come, Cloudflare is likely to be dragged into many more copyright lawsuits, and when they are, their counterparts are going to bring up Cloudflare’s voluntary decision to kick a website off the Internet.

Unless Cloudflare suddenly decides to pull all pirate sites from its service tomorrow, of course.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

“Public Figure” Threatened With Exposure Over Gay Piracy ‘Fine’

Post Syndicated from Andy original https://torrentfreak.com/public-figure-threatened-with-exposure-over-gay-piracy-fine-170817/

Flava Works is an Illinois-based company specializing in adult material featuring black and Latino men. It operates an aggressive anti-piracy strategy which has resulted in some large damages claims in the past.

Now, however, the company has found itself targeted by a lawsuit filed by one of its alleged victims. Filed in a California district court by an unnamed individual, it accuses Flava Works of shocking behavior relating to a claim of alleged piracy.

According to the lawsuit, ‘John Doe’ received a letter in early June from Flava Works CEO Phillip Bleicher, accusing him of Internet piracy. Titled “Settlement Demand and Cease and Desist”, the letter got straight to the point.

“Flava Works is aware that you have been ‘pirating’ the content from its website(s) for your own personal financial benefit,” the letter read.

[Update: ‘John Doe’ has now been identified as Marc Juris, President & General Manager of AMC-owned WE tv. All references to John Doe below refer to Juris. See note at footer]

As is often the case with such claims, Flava Works offered to settle with John Doe for a cash fee. However, instead of the few hundred or thousand dollars usually seen in such cases, the initial settlement amount was an astronomical $97,000. But that wasn’t all.

According to John Doe, Bleicher warned that unless the money was paid in ten days, Flava Works “would initiate litigation against [John Doe], publically accusing him of being a consumer and pirate of copyrighted gay adult entertainment.”

Amping up the pressure, Bleicher then warned that after the ten-day deadline had passed, the settlement amount of $97,000 would be withdrawn and replaced with a new amount – $525,000.

The lawsuit alleges that Bleicher followed up with more emails in which he indicated that there was still time to settle the matter “one on one” since the case hadn’t been assigned to an attorney. However, he warned John Doe that time was running out and that public exposure via a lawsuit would be the next step.

While these kinds of tactics are nothing new in copyright infringement cases, the amounts of money involved are huge, indicating something special at play. Indeed, it transpires that John Doe is a public figure in the entertainment industry and the suggestion is that Flava Works’ assessment of his “wealth and profile” means he can pay these large sums.

According to the suit, on July 6, 2017, Bleicher sent another email to John Doe which “alluded to [his] high-profile status and to the potential publicity that a lawsuit would bring.” The email went as far as threatening an imminent Flava Works press release, announcing that a public figure, who would be named, was being sued for pirating gay adult content.

Flava Works alleges that John Doe uploaded its videos to various BitTorrent sites and forums, but John Doe vigorously denies the accusations, noting that the ‘evidence’ presented by Flava Works fails to back up its claims.

“The materials do not reveal or expose infringement of any sort. [Flava Works’] real purpose in sending this ‘proof’ was to demonstrate just how humiliating it would be to defend against Flava Works’ scurrilous charges,” John Doe’s lawsuit notes.

“[Flava Works’] materials consist largely of screen shots of extremely graphic images of pornography, which [Flava Works] implies that [John Doe] has viewed — but which are completely irrelevant given that they are not Flava Works content. Nevertheless, Bleicher assured [John Doe] that these materials would all be included in a publicly filed lawsuit if he refused to accede to [Flava Works’] payment demands.”

From his lawsuit (pdf) it’s clear that John Doe is in no mood to pay Flava Works large sums of cash and he’s aggressively on the attack, describing the company’s demands as “criminal extortion.”

He concludes with a request for a declaration that he has not infringed Flava Works’ copyrights, while demanding attorneys’ fees and further relief to be determined by the court.

The big question now is whether Flava Works will follow through with its threats to expose the entertainer, or whether it will drift back into the shadows to fight another day. Definitely one to watch.

Update: Flava Works has now followed through on its threat to sue Juris. A complaint filed at an Illinois court accuses the TV executive of uploading Flava Works titles to several gay-focused torrent sites in breach of copyright. It demands $1.2m in damages.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

AWS Online Tech Talks – August 2017

Post Syndicated from Sara Rodas original https://aws.amazon.com/blogs/aws/aws-online-tech-talks-august-2017/

Welcome to mid-August, everyone–the season of beach days, family road trips, and an inbox full of “out of office” emails from your coworkers. Just in case spending time indoors has you feeling a bit blue, we’ve got a piping hot batch of AWS Online Tech Talks for you to check out. Kick up your feet, grab a glass of ice cold lemonade, and dive into our latest Tech Talks on Compute and DevOps.

August 2017 – Schedule

Noted below are the upcoming scheduled live, online technical sessions being held during the month of August. Make sure to register ahead of time so you won’t miss out on these free talks conducted by AWS subject matter experts.

Webinars featured this month are:

Thursday, August 17 – Compute

9:00 – 9:40 AM PDT: Deep Dive on Lambda@Edge.

Monday, August 28 – DevOps

10:30 – 11:10 AM PDT: Building Python Serverless Applications with AWS Chalice.

12:00 – 12:40 PM PDT: How to Deploy .NET Code to AWS from Within Visual Studio.

The AWS Online Tech Talks series covers a broad range of topics at varying technical levels. These sessions feature live demonstrations & customer examples led by AWS engineers and Solution Architects. Check out the AWS YouTube channel for more on-demand webinars on AWS technologies.

– Sara (Hello everyone, I’m a co-op from Northeastern University joining the team until December.)