Tag Archives: University

UK Government Teaches 7-Year-Olds That Piracy is Stealing

Post Syndicated from Ernesto original https://torrentfreak.com/uk-government-teaches-7-year-olds-that-piracy-is-stealing-180118/

In 2014, Mike Weatherley, the UK Government’s top IP advisor at the time, offered a recommendation that copyright education should be added to the school curriculum, starting with the youngest kids in primary school.

The idea was that new generations should learn copyright morals and ethics, and a few months later the first version of the new “Cracking Ideas” curriculum was made public.

In the years that followed new course material was added, published by the UK’s Intellectual Property Office (IPO) with support from the local copyright industry. The teaching material is aimed at a variety of ages, including those who have just started primary school.

Part of the education features a fictitious cartoon band called Nancy and the Meerkats. With help from their manager, they learn key copyright lessons, and this week several new videos were published, the BBC points out.

The videos try to explain concepts including copyright, trademarks, and how people can protect the things they’ve created. Interestingly, the videos themselves use names of existing musicians, with puns such as Ed Shealing, Justin Beaver, and the evil Kitty Perry. Even Nancy and the Meerkats appears to be a play on the classic 1970s cartoon series Josie and the Pussycats, featuring a pop band of the same name.

The play on Ed Sheeran’s name is interesting, to say the least. While he’s one of the most popular artists today, he also mentioned in the past that file-sharing made his career.

“…illegal file sharing was what made me. It was students in England going to university, sharing my songs with each other,” Sheeran said in an interview with CBS last year.

But that didn’t stop the IPO from using his likeness for their anti-file-sharing campaign. According to Catherine Davies of IPO’s education outreach department, knowledge about key intellectual property issues is a “life skill” nowadays.

“In today’s digital environment, even very young people are IP consumers, accessing online digital content independently and regularly,” she tells the BBC. “A basic understanding of IP and a respect for others’ IP rights is therefore a key life skill.”

While we doubt that these concepts will appeal to the average five-year-old, the course material does its best to simplify complex copyright issues. Perhaps that’s also where the danger lies.

The program is in part backed by copyright-reliant industries, who have a different view on the matter than many others. For example, a previously published video of Nancy and the Meerkats deals with the topic of file-sharing.

When the Meerkats found out that people were downloading their tracks from pirate sites, they were outraged, and their manager Big Joe explained that file-sharing is just the same as stealing a CD from a physical store.

“In a way, all those people who downloaded free copies are doing the same thing as walking out of the shop with a CD and forgetting to go to the till,” he says.

“What these sites are doing is sometimes called piracy. It not only affects music but also videos, books, and movies. If someone owns the copyright to something, well, it is stealing. Simple as that,” Big Joe adds.

The Pirates of the Internet!

While we won’t go into the copying vs. stealing debate, it’s interesting that there is no mention of more liberal copyright licenses. There are thousands of artists who freely share their work after all, by adopting Creative Commons licenses for example. Downloading these tracks is certainly not stealing.

Jim Killock, director of the Open Rights Group, notes that the campaign is a bit extreme at points.

“Infringing copyright is a bad thing, but it is not the same as physical theft. Many children will guess that making a copy is not the same as making off with the local store’s chocolate bars,” he says.

“Children aren’t born bureaucrats, and they are surrounded by stupid rules made by stupid adults. Presumably, the IPO doesn’t want children to conclude that copyright is just another one, so they should be a bit more careful with how they explain things.”

Killock also stresses that children copy a lot of things in school, which would normally violate copyright. However, thanks to educational exceptions, they don’t get into trouble. The IPO could pay more attention to these exceptions going forward.

Perhaps Nancy and the Meerkats could decide to release a free to share track in a future episode, for example, and encourage kids to use it for their own remixes, or other creative projects. Creativity and copyright are not all about restrictions, after all.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

US Govt Brands Torrent, Streaming & Cyberlocker Sites As Notorious Markets

Post Syndicated from Andy original https://torrentfreak.com/us-govt-brands-torrent-streaming-cyberlocker-sites-as-notorious-markets-180115/

In its annual “Out-of-Cycle Review of Notorious Markets”, the office of the United States Trade Representative (USTR) has published a long list of websites said to be involved in online piracy.

The list is compiled with high-level input from various trade groups, including the MPAA and RIAA who both submitted their recommendations (1,2) during early October last year.

With the word “allegedly” used more than two dozen times in the report, the US government notes that its report does not constitute cast-iron proof of illegal activity. However, it urges the countries from where the so-called “notorious markets” operate to take action where they can, while putting owners and facilitators on notice that their activities are under the spotlight.

“A goal of the List is to motivate appropriate action by owners, operators, and service providers in the private sector of these and similar markets, as well as governments, to reduce piracy and counterfeiting,” the report reads.

“USTR highlights the following marketplaces because they exemplify global counterfeiting and piracy concerns and because the scale of infringing activity in these marketplaces can cause significant harm to U.S. intellectual property (IP) owners, consumers, legitimate online platforms, and the economy.”

The report begins with a page titled “Issue Focus: Illicit Streaming Devices”. Unsurprisingly, particularly given their place in dozens of headlines last year, the segment focuses on the set-top box phenomenon. The piece doesn’t list any apps or software tools as such but highlights the general position, claiming a cost to the US entertainment industry of $4-5 billion a year.

Torrent Sites

In common with previous years, the USTR goes on to list several of the world’s top torrent sites but, due to changes in circumstances, others have been delisted. ExtraTorrent, which shut down in May 2017, is one such example.

As the world’s most famous torrent site, The Pirate Bay gets a prominent mention, with the USTR noting that the site is of “symbolic importance as one of the longest-running and most vocal torrent sites.” The USTR underlines the site’s resilience by noting its hydra-like form while revealing an apparent secret concerning its hosting arrangements.

“The Pirate Bay has allegedly had more than a dozen domains hosted in various countries around the world, applies a reverse proxy service, and uses a hosting provider in Vietnam to evade further enforcement action,” the USTR notes.

Other torrent sites singled out for criticism include RARBG, which was nominated for the listing by the movie industry. According to the USTR, the site is hosted in Bosnia and Herzegovina and has changed hosting services to prevent shutdowns in recent years.

1337x.to and the meta-search engine Torrentz2 are also given a prime mention, with the USTR noting that they are “two of the most popular torrent sites that allegedly infringe U.S. content industry’s copyrights.” Russia’s RuTracker is also targeted for criticism, with the government noting that it’s now one of the most popular torrent sites in the world.

Streaming & Cyberlockers

While torrent sites are still important, the USTR reserves considerable space in its report for streaming portals and cyberlocker-type services.

4Shared.com, a file-hosting site that has been targeted by tens of millions of copyright notices, is reportedly no longer able to use major US payment providers. Nevertheless, the British Virgin Islands company still collects significant sums from premium accounts, advertising, and offshore payment processors, the USTR notes.

Cyberlocker Rapidgator gets another prominent mention in 2017, with the USTR noting that the Russian-hosted platform generates millions of dollars every year through premium memberships while employing rewards and affiliate schemes.

Due to its increasing popularity as a hosting and streaming operation, Openload.co (Romania) is now a big target for the USTR. “The site is used frequently in combination with add-ons in illicit streaming devices. In November 2017, users visited Openload.co a staggering 270 million times,” the USTR writes.

Owned by a Swiss company and hosted in the Netherlands, the popular site Uploaded is also criticized by the US alongside France’s 1Fichier.com, which allegedly hosts pirate games while being largely unresponsive to takedown notices. Dopefile.pk, a Pakistan-based storage outfit, is also highlighted.

On the video streaming front, it’s perhaps no surprise that the USTR focuses on sites like FMovies (Sweden), GoStream (Vietnam), Movie4K.tv (Russia) and PrimeWire. An organization collectively known as the MovShare group, which encompasses Nowvideo.sx, WholeCloud.net, NowDownload.cd, MeWatchSeries.to and WatchSeries.ac, among others, is also listed.

Unauthorized music / research papers

While most of the above are either focused on video or feature it as part of their repertoire, other sites are listed for their attention to music. Convert2MP3.net is named as one of the most popular stream-ripping sites in the world and is highlighted due to the prevalence of YouTube-downloader sites and the 2017 demise of YouTube-MP3.

“Convert2MP3.net does not appear to have permission from YouTube or other sites and does not have permission from right holders for a wide variety of music represented by major U.S. labels,” the USTR notes.

Given the amount of attention the site has received in 2017 as ‘The Pirate Bay of Research’, Libgen.io and Sci-Hub.io (not to mention the endless proxy and mirror sites that facilitate access) are given a detailed mention in this year’s report.

“Together these sites make it possible to download — all without permission and without remunerating authors, publishers or researchers — millions of copyrighted books by commercial publishers and university presses; scientific, technical and medical journal articles; and publications of technological standards,” the USTR writes.

Service providers

But it’s not only sites that are being put under pressure. Following a growing list of nominations in previous years, Swiss service provider Private Layer is again singled out as a rogue player in the market for hosting 1337x.to and Torrentz2.eu, among others.

“While the exact configuration of websites changes from year to year, this is the fourth consecutive year that the List has stressed the significant international trade impact of Private Layer’s hosting services and the allegedly infringing sites it hosts,” the USTR notes.

“Other listed and nominated sites may also be hosted by Private Layer but are using reverse proxy services to obfuscate the true host from the public and from law enforcement.”

The USTR notes Switzerland’s efforts to close a legal loophole that restricts enforcement and looks forward to a positive outcome when the draft amendment is considered by parliament.

Perhaps a little surprisingly given its recent anti-piracy efforts and overtures to the US, Russia’s leading social network VK.com again gets a place on the new list. The USTR recognizes VK’s efforts but insists that more needs to be done.

Social networking and e-commerce

“In 2016, VK reached licensing agreements with major record companies, took steps to limit third-party applications dedicated to downloading infringing content from the site, and experimented with content recognition technologies,” the USTR writes.

“Despite these positive signals, VK reportedly continues to be a hub of infringing activity and the U.S. motion picture industry reports that they find thousands of infringing files on the site each month.”

Finally, in addition to traditional pirate sites, the US also lists online marketplaces that allegedly fail to meet appropriate standards. Re-added to the list in 2016 after a brief hiatus in 2015, China’s Alibaba is listed again in 2017. The development provoked an angry response from the company.

Describing his company as a “scapegoat”, Alibaba Group President Michael Evans said that his platform had achieved a 25% drop in takedown requests and has even been removing infringing listings before they make it online.

“In light of all this, it’s clear that no matter how much action we take and progress we make, the USTR is not actually interested in seeing tangible results,” Evans said in a statement.

The full list of sites in the Notorious Markets Report 2017 (pdf) can be found below.

– 1fichier.com – (cyberlocker)
– 4shared.com – (cyberlocker)
– convert2mp3.net – (stream-ripper)
– Dhgate.com (e-commerce)
– Dopefile.pl – (cyberlocker)
– Firestorm-servers.com (pirate gaming service)
– Fmovies.is, Fmovies.se, Fmovies.to – (streaming)
– Gostream.is, Gomovies.to, 123movieshd.to (streaming)
– Indiamart.com (e-commerce)
– Kinogo.club, kinogo.co (streaming host, platform)
– Libgen.io, sci-hub.io, libgen.pw, sci-hub.cc, sci-hub.bz, libgen.info, lib.rus.ec, bookfi.org, bookzz.org, booker.org, booksc.org, book4you.org, bookos-z1.org, booksee.org, b-ok.org (research downloads)
– Movshare Group – Nowvideo.sx, wholecloud.net, auroravid.to, bitvid.sx, nowdownload.ch, cloudtime.to, mewatchseries.to, watchseries.ac (streaming)
– Movie4k.tv (streaming)
– MP3VA.com (music)
– Openload.co (cyberlocker / streaming)
– 1337x.to (torrent site)
– Primewire.ag (streaming)
– Torrentz2, Torrentz2.me, Torrentz2.is (torrent site)
– Rarbg.to (torrent site)
– Rebel (domain company)
– Repelis.tv (movie and TV linking)
– RuTracker.org (torrent site)
– Rapidgator.net (cyberlocker)
– Taobao.com (e-commerce)
– The Pirate Bay (torrent site)
– TVPlus, TVBrowser, Kuaikan (streaming apps and addons, China)
– Uploaded.net (cyberlocker)
– VK.com (social networking)

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Musician’s White Noise YouTube Video Hit With Copyright Complaints

Post Syndicated from Andy original https://torrentfreak.com/musicians-white-noise-youtube-video-hit-with-copyright-complaints-180105/

When people upload original content to YouTube, there should be no problem with getting paid for that content, should it attract enough interest from the public.

Those who upload infringing content get a much less easy ride, with their uploads getting flagged for abuse, potentially putting their accounts at risk.

That’s what’s happened to Australia-based music technologist Sebastian Tomczak, who uploaded a completely non-infringing work to YouTube and now faces five separate copyright complaints.

“I teach and work in a music department at a University here in Australia. I’ve got a PhD in chiptune, and my main research interests are various intersections of music / sound / tech e.g. arduino programming and DIY stuff, modular synthesis, digital production, sound design for games, etc,” Tomczak informs TF.

“I started blogging about music around a decade ago or so, mainly to write about stuff I was interested in, researching or doing. At the time this would have been physical interaction, music controller design, sound design and composition involving computers.”

One of Tomczak’s videos was a masterpiece entitled “10 Hours of Low Level White Noise” which features – wait for it – ten hours of low-level white noise.

“The white noise video was part of a number of videos I put online at the time. I was interested in listening to continuous sounds of various types, and how our perception of these kinds of sounds and our attention changes over longer periods – e.g. distracted, focused, sleeping, waking, working etc,” Tomczak says.

White noise is the sound created when all different frequencies are combined together into a kind of audio mush that’s a little baffling and yet soothing in the right circumstances. Some people use it to fall asleep a little easier, others to distract their attention away from irritating sounds in the environment, like an aircon system or fan, for example.

The white noise made by Tomczak and presented in his video was all his own work.

“I ‘created’ and uploaded the video in question. The video was created by generating a noise waveform of 10 hours length using the freeware software Audacity and the built-in noise generator. The resulting 10-hour audio file was then imported into ScreenFlow, where the text was added and then rendered as one 10-hour video file,” he explains.

This morning, however, Tomczak received a complaint from YouTube after a copyright holder claimed that it had the rights to his composition. When he checked his YouTube account, yet more complaints greeted him. In fact, since July 2015, when the video was first uploaded, a total of five copyright complaints had been filed against Tomczak’s composition.

As seen from the image below, posted by Tomczak to his Twitter account, the five complaints came from four copyright holders, with one feeling the need to file two separate complaints while citing two different works.

The complaints against Tomczak’s white noise

One company involved – Catapult Distribution – says that Tomczak’s composition infringes on the copyrights of “White Noise Sleep Therapy”, a client selling the title “Majestic Ocean Waves”. It also manages to do the same for the company’s “Soothing Baby Sleep” title. The other complaints come from Merlin Symphonic Distribution and Dig Dis for similar works.

Under normal circumstances, Tomczak’s account could have been disabled by YouTube for so many infringements, but in all cases the copyright holders chose to monetize the musician’s ‘infringement’ instead, via the site’s Content ID system. In other words, even though Tomczak created the video entirely through his own efforts, copyright holders are now taking all the revenue. It’s a situation that Tomczak will now dispute with YouTube.

“I’ve had quite a few copyright claims against me, usually based on cases where I’ve made long mixes of work, or longer pieces. Usually I don’t take them too seriously,” he explains.

“In any of the cases where I think a given claim would be an issue, I would dispute it by saying I could either prove that I have made the work, have the original materials that generated the work, or could show enough of the components included in the work to prove originality. This has always been successful for me and I hope it will be in this case as well.”

Sadly, this isn’t the only problem Tomczak’s had with YouTube’s copyright complaints system. A while back the musician was asked to take part in a video for his workplace but things didn’t go well.

“I was asked to participate in a video for my workplace and the production team asked if they could use my music and I said ‘no problem’. A month later, the video was uploaded to one of our work channels, and then YouTube generated a copyright claim against me for my own music from the work channel,” he reveals.

Tomczak says that, to him, automated copyright claims are largely an annoyance, but if he were making real money from YouTube, the system would be detrimental in the long run. He feels it’s something that YouTube should adjust, to ensure that false claims aren’t filed against uploads like his.

While he tries to sort out this mess with YouTube, there is some good news. Other videos of his including “10 Hours of a Perfect Fifth“, “The First 106 Fifths Derived from a 3/2 Ratio” and “Hour-Long Octave Shift” all remain copyright-complaint free.

For now…

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

WebTorrent Desktop Hits a Million Downloads

Post Syndicated from Ernesto original https://torrentfreak.com/webtorrent-desktop-hits-a-million-downloads-180104/

Fifteen years ago BitTorrent conquered the masses. It offered a superior way to share large video files, something that was virtually impossible at the time.

With the shift to online video streaming, BitTorrent has lost prominence in recent years. That’s a shame, since the technology offers many advantages.

This is one of the reasons why Stanford University graduate Feross Aboukhadijeh invented WebTorrent. The technology, which is supported by most modern browsers, allows users to seamlessly stream videos on the web with BitTorrent.

In the few years that it’s been around, several tools and services have been built on WebTorrent, including a dedicated desktop client. The desktop version basically serves as a torrent client that streams torrents almost instantaneously on Windows, Linux, and Mac.

Add in AirPlay, Chromecast and DLNA support and it brings these videos to any network-connected TV as well. Quite a powerful tool, as many people have discovered in recent months.

This week Feross informed TorrentFreak that WebTorrent Desktop had reached the one million download mark. That’s a major milestone for a modest project with no full-time developer. But while users seem to be happy, it’s not perfect yet.

“WebTorrent Desktop is the best torrent app in existence. Yet, the app suffers from performance issues when too many torrents are added or too many peers show up. It’s also missing important power user features like bandwidth throttling,” Feross says.

The same is true for WebTorrent itself, which the desktop version is built on. The software has been on the verge of version 1.0.0 for over two years now but needs some more work to make the final leap. This is why Feross would like to invest more time into the projects, given the right support.

Last month Feross launched a Patreon campaign to crowdfund future development of WebTorrent including the desktop version. There are dozens of open issues and a lot of plans and with proper funding, the developer can free up time to work on these.

“The goal of the campaign is to allow me to spend a few days per week addressing these issues,” Feross says, adding that all software he works on is completely free and always has been.

Feross and cat

Thus far the fundraising campaign is going well. WebTorrent’s developer has received support from dozens of people, totaling $1,730 a month through Patreon alone, and he has signed up the privacy oriented browser Brave and video site PopChest as Platinum backers.

Community-driven funding is a great way to support Open Source projects, Feross believes, and he is encouraging others to try it out as well.

“I’ve been promoting Patreon heavily within my community as a way for open source software developers to get paid for their work,” Feross says.

“The norm in the industry right now is that no one gets paid — it’s all volunteer work, even though we’re generating a lot of value for the world! Patreon is a really promising solution for software people like me.”

People who want to give WebTorrent Desktop a try can download a copy from the official site. More information on the core WebTorrent technology and its implementations is available there as well. And if you like what you see, Feross still needs a bit of help to reach his Patreon goal.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Filmmakers Want The Right to Break DRM and Rip Blu-Rays

Post Syndicated from Ernesto original https://torrentfreak.com/filmmakers-want-the-right-to-break-drm-and-rip-blu-rays-171228/

The major movie studios are doing everything in their power to stop the public from copying films.

While nearly every movie and TV-show leaks on the Internet, these companies still see DRM as a vital tool to prevent piracy from spiraling out of control.

Technically speaking it’s not hard to rip a DVD or Blu-Ray disc nowadays, and the same is true for ripping content from Netflix or YouTube. However, people who do this are breaking the law.

The DMCA’s anti-circumvention provisions specifically forbid it. There are some exemptions, for educational use for example, and to allow for other types of fair use, but the line between legal and illegal is not always clear.

Interestingly, filmmakers are not happy with the current law either. They often want to use small pieces of other videos in their films, but under the current exemptions, this is only permitted for documentaries.

The International Documentary Association, Kartemquin Films, Independent Filmmaker Project, University of Film and Video Association and several other organizations hope this will change.

In a comment to the Copyright Office, which is currently considering updates to the exemptions, they argue that all filmmakers should be allowed to break DRM and rip Blu-Rays.

According to the filmmakers, the documentary genre is vaguely defined, which leads to a lot of confusion over whether or not the exemptions apply. They therefore suggest applying the exemption to all filmmakers, instead of criminalizing those who don’t identify themselves as documentarians.

“Since 2010, exemptions applicable to documentary filmmaking have been in effect. This exemption has helped many filmmakers, and there has been neither evidence nor any allegation that this exemption has harmed rightsholders in any way.

“There is no reason this would change if the ‘documentary’ limitation were removed. All filmmakers regularly need access to footage on DVDs and without an exemption to DVDs, many non-infringing uses simply cannot be made,” the groups add.

The submission includes letters from several filmmakers who explain why an exemption would be crucial to them.

Filmmakers Steve Boettcher and Mike Trinklein explain that they refrained from making a film the way they wanted to, fearing legal trouble. Their film included a lot of drama elements and was not a typical documentary.

“Given the significant amount of drama in the film [we are working on], we decided early on that our storytelling toolbox could not include fair use of materials from DVD or Blu-ray, because the exemption did not cover accessing that material for use in a drama,” they write.

“Already, we were hindered in our ability to tell these stories. So, there is already a chilling effect in that a drama-heavy documentary might be seen as a drama outright, and thus under a different set of rules.”

Another filmmaker, who wants to remain anonymous, plans on making a hybrid documentary/narrative feature about a famous film duo. Without ripping the clips he needs, this movie is never going to be made.

“I am unsure of whether my project would fall under the exemption because it is a combination of documentary and narrative, and my fear of a lawsuit once my project is publicly viewed and distributed stops me from ripping from these sources.”

These are just two of many examples where filmmakers show that they need to break DRM and rip content to make the work they want.

The MPAA and others have previously argued that these changes are not required. Instead, they pointed out that people could point their cameras or phones at the screen to record something, or use screen capture software.

However, these are not viable alternatives according to the filmmakers, as the quality is inferior. They, therefore, call on the Copyright Office to expand the exemption to cover all films and filmmakers.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Power data ingestion into Splunk using Amazon Kinesis Data Firehose

Post Syndicated from Tarik Makota original https://aws.amazon.com/blogs/big-data/power-data-ingestion-into-splunk-using-amazon-kinesis-data-firehose/

In late September, during the annual Splunk .conf, Splunk and Amazon Web Services (AWS) jointly announced that Amazon Kinesis Data Firehose now supports Splunk Enterprise and Splunk Cloud as a delivery destination. This native integration between Splunk Enterprise, Splunk Cloud, and Amazon Kinesis Data Firehose is designed to make AWS data ingestion setup seamless, while offering a secure and fault-tolerant delivery mechanism. We want to enable customers to monitor and analyze machine data from any source and use it to deliver operational intelligence and optimize IT, security, and business performance.

With Kinesis Data Firehose, customers get a fully managed, reliable, and scalable way to stream data into Splunk. In this post, we tell you a bit more about the Kinesis Data Firehose and Splunk integration. We also show you how to ingest large amounts of data into Splunk using Kinesis Data Firehose.

Push vs. Pull data ingestion

Presently, customers use a combination of two ingestion patterns, primarily based on data source and volume, in addition to existing company infrastructure and expertise:

  1. Pull-based approach: Using dedicated pollers running the popular Splunk Add-on for AWS to pull data from various AWS services such as Amazon CloudWatch or Amazon S3.
  2. Push-based approach: Streaming data directly from AWS to Splunk HTTP Event Collector (HEC) by using AWS Lambda. Examples of applicable data sources include CloudWatch Logs and Amazon Kinesis Data Streams.

The pull-based approach offers data delivery guarantees such as retries and checkpointing out of the box. However, it requires more operational effort to manage and orchestrate the dedicated pollers, which commonly run on Amazon EC2 instances. With this setup, you pay for the infrastructure even when it’s idle.

On the other hand, the push-based approach offers a low-latency scalable data pipeline made up of serverless resources like AWS Lambda sending directly to Splunk indexers (by using Splunk HEC). This approach translates into lower operational complexity and cost. However, if you need guaranteed data delivery then you have to design your solution to handle issues such as a Splunk connection failure or Lambda execution failure. To do so, you might use, for example, AWS Lambda Dead Letter Queues.
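As a rough illustration of that last point, a Lambda function that forwards events to Splunk HEC can be pointed at an SQS dead letter queue so failed invocations are retained for later replay. This is a hedged sketch only; the function name and queue name are hypothetical placeholders, and YOUR-AWS-ACCT-NUM stands in for your account ID.

# Attach an SQS dead letter queue to a (hypothetical) Lambda forwarder
$ aws lambda update-function-configuration \
    --function-name splunk-hec-forwarder \
    --dead-letter-config TargetArn=arn:aws:sqs:us-east-1:YOUR-AWS-ACCT-NUM:splunk-hec-dlq

Events that land in the queue can then be inspected or re-driven once the Splunk endpoint is healthy again.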

How about getting the best of both worlds?

Let’s go over the new integration’s end-to-end solution and examine how Kinesis Data Firehose and Splunk together expand the push-based approach into a native AWS solution for applicable data sources.

By using a managed service like Kinesis Data Firehose for data ingestion into Splunk, we provide out-of-the-box reliability and scalability. One of the pain points of the old approach was the overhead of managing the data collection nodes (Splunk heavy forwarders). With the new Kinesis Data Firehose to Splunk integration, there are no forwarders to manage or set up. Data producers (1) are configured through the AWS Management Console to drop data into Kinesis Data Firehose.

You can also create your own data producers. For example, you can drop data into a Firehose delivery stream by using Amazon Kinesis Agent, or by using the Firehose API (PutRecord(), PutRecordBatch()), or by writing to a Kinesis Data Stream configured to be the data source of a Firehose delivery stream. For more details, refer to Sending Data to an Amazon Kinesis Data Firehose Delivery Stream.
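To make the API route concrete, a single record can be pushed from the AWS CLI as shown below. This is a sketch under assumptions: the stream name reuses the FirehoseSplunkDeliveryStream created later in this walkthrough, and the Data value is simply the base64-encoded string "Hello Splunk!". Depending on your CLI version you may need to adjust how the blob is passed (for example, --cli-binary-format raw-in-base64-out on AWS CLI v2).

# Push one record into the delivery stream (Data is base64 for "Hello Splunk!")
$ aws firehose put-record \
    --delivery-stream-name FirehoseSplunkDeliveryStream \
    --record '{"Data":"SGVsbG8gU3BsdW5rIQ=="}'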

You might need to transform the data before it goes into Splunk for analysis. For example, you might want to enrich it, or filter or anonymize sensitive data. You can do so using AWS Lambda. In this scenario, Kinesis Data Firehose buffers the incoming source data, sends it to the specified Lambda function (2), and then rebuffers the transformed data before delivering it to the Splunk cluster. Kinesis Data Firehose provides Lambda blueprints that you can use to create a Lambda function for data transformation.

Systems fail all the time. Let’s see how this integration handles outside failures to guarantee data durability. In cases when Kinesis Data Firehose can’t deliver data to the Splunk Cluster, data is automatically backed up to an S3 bucket. You can configure this feature while creating the Firehose delivery stream (3). You can choose to back up all data or only the data that’s failed during delivery to Splunk.

In addition to using S3 for data backup, this Firehose integration with Splunk supports Splunk Indexer Acknowledgments to guarantee event delivery. This feature is configured on Splunk’s HTTP Event Collector (HEC) (4). It ensures that HEC returns an acknowledgment to Kinesis Data Firehose only after data has been indexed and is available in the Splunk cluster (5).

Now let’s look at a hands-on exercise that shows how to forward VPC flow logs to Splunk.

How-to guide

To process VPC flow logs, we implement the following architecture.

Amazon Virtual Private Cloud (Amazon VPC) delivers flow log files into an Amazon CloudWatch Logs group. Using a CloudWatch Logs subscription filter, we set up real-time delivery of CloudWatch Logs to a Kinesis Data Firehose stream.

Data coming from CloudWatch Logs is compressed with gzip compression. To work with this compression, we need to configure a Lambda-based data transformation in Kinesis Data Firehose to decompress the data and deposit it back into the stream. Firehose then delivers the raw logs to the Splunk Http Event Collector (HEC).

If delivery to the Splunk HEC fails, Firehose deposits the logs into an Amazon S3 bucket. You can then ingest the events from S3 using an alternate mechanism such as a Lambda function.

When data reaches Splunk (Enterprise or Cloud), Splunk parsing configurations (packaged in the Splunk Add-on for Kinesis Data Firehose) extract and parse all fields. They make data ready for querying and visualization using Splunk Enterprise and Splunk Cloud.

Walkthrough

Install the Splunk Add-on for Amazon Kinesis Data Firehose

The Splunk Add-on for Amazon Kinesis Data Firehose enables Splunk (be it Splunk Enterprise, Splunk App for AWS, or Splunk Enterprise Security) to use data ingested from Amazon Kinesis Data Firehose. Install the Add-on on all the indexers with an HTTP Event Collector (HEC). The Add-on is available for download from Splunkbase.

HTTP Event Collector (HEC)

Before you can use Kinesis Data Firehose to deliver data to Splunk, set up the Splunk HEC to receive the data. From Splunk web, go to the Settings menu, choose Data Inputs, and choose HTTP Event Collector. Choose Global Settings, ensure All tokens is enabled, and then choose Save. Then choose New Token to create a new HEC endpoint and token. When you create a new token, make sure that Enable indexer acknowledgment is checked.

When prompted to select a source type, select aws:cloudwatch:vpcflow.
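Before wiring up Firehose, it can be useful to send a test event to the new token with curl. Treat this as a hedged sketch: the hostname, token, and channel GUID are placeholders, 8088 is only the default HEC port, and the X-Splunk-Request-Channel header is needed because indexer acknowledgment is enabled on the token. If the token works, Splunk should respond with a small JSON body indicating success.

# Send a test event to the HEC endpoint (placeholders: host, token, channel GUID)
$ curl -k https://your-splunk-host:8088/services/collector/event \
    -H "Authorization: Splunk YOUR-HEC-TOKEN" \
    -H "X-Splunk-Request-Channel: 0aeeac95-ac74-4aa8-9dcc-6428e58bde11" \
    -d '{"sourcetype": "aws:cloudwatch:vpcflow", "event": "HEC test event"}'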

Create an S3 backsplash bucket

To provide for situations in which Kinesis Data Firehose can’t deliver data to the Splunk Cluster, we use an S3 bucket to back up the data. You can configure this feature to back up all data or only the data that’s failed during delivery to Splunk.

Note: Bucket names are unique. Thus, you can’t use tmak-backsplash-bucket.

aws s3api create-bucket --bucket tmak-backsplash-bucket --create-bucket-configuration LocationConstraint=ap-northeast-1

Create an IAM role for the Lambda transform function

Firehose triggers an AWS Lambda function that transforms the data in the delivery stream. Let’s first create a role for the Lambda function called LambdaBasicRole.

Note: You can also set this role up when creating your Lambda function.

$ aws iam create-role --role-name LambdaBasicRole --assume-role-policy-document file://TrustPolicyForLambda.json

Here is TrustPolicyForLambda.json.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

 

After the role is created, attach the managed Lambda basic execution policy to it.

$ aws iam attach-role-policy \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole \
  --role-name LambdaBasicRole

 

Create a Firehose Stream

On the AWS console, open the Amazon Kinesis service, go to the Firehose console, and choose Create Delivery Stream.

In the next section, you can specify whether you want to use an inline Lambda function for transformation. Because incoming CloudWatch Logs are gzip compressed, choose Enabled for Record transformation, and then choose Create new.

From the list of the available blueprint functions, choose Kinesis Data Firehose CloudWatch Logs Processor. This function unzips the data and places it back into the Firehose stream in compliance with the record transformation output model.
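For reference, each record the transformation function returns has to follow that output model: it echoes the incoming recordId, sets result to Ok, Dropped, or ProcessingFailed, and carries the transformed payload base64-encoded in data. The response below is only an illustrative sketch; the recordId shown is made up and the data field is just the base64 string "Hello Splunk!".

{
  "records": [
    {
      "recordId": "49546986683135544286507457936321625675700192471156785154",
      "result": "Ok",
      "data": "SGVsbG8gU3BsdW5rIQ=="
    }
  ]
}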

Enter a name for the Lambda function, choose Choose an existing role, and then choose the role you created earlier. Then choose Create Function.

Go back to the Firehose Stream wizard, choose the Lambda function you just created, and then choose Next.

Select Splunk as the destination, and enter your Splunk Http Event Collector information.

Note: Amazon Kinesis Data Firehose requires the Splunk HTTP Event Collector (HEC) endpoint to be terminated with a valid CA-signed certificate matching the DNS hostname used to connect to your HEC endpoint. You receive delivery errors if you are using a self-signed certificate.

In this example, we only back up logs that fail during delivery.

To monitor your Firehose delivery stream, enable error logging. Doing this means that you can monitor record delivery errors.

Create an IAM role for the Firehose stream by choosing Create new, or Choose. Doing this brings you to a new screen. Choose Create a new IAM role, give the role a name, and then choose Allow.

If you look at the policy document, you can see that the role gives Kinesis Data Firehose permission to publish error logs to CloudWatch, execute your Lambda function, and put records into your S3 backup bucket.

You now get a chance to review and adjust the Firehose stream settings. When you are satisfied, choose Create Stream. You get a confirmation once the stream is created and active.

Create a VPC Flow Log

To send events from Amazon VPC, you need to set up a VPC flow log. If you already have a VPC flow log you want to use, you can skip to the “Publish CloudWatch to Kinesis Data Firehose” section.

On the AWS console, open the Amazon VPC service. Then choose VPC, Your VPC, and choose the VPC you want to send flow logs from. Choose Flow Logs, and then choose Create Flow Log. If you don’t have an IAM role that allows your VPC to publish logs to CloudWatch, choose Set Up Permissions and Create new role. Use the defaults when presented with the screen to create the new IAM role.

Once active, your VPC flow log should look like the following.

Publish CloudWatch to Kinesis Data Firehose

When you generate traffic to or from your VPC, the log group is created in Amazon CloudWatch. The new log group has no subscription filter, so set up a subscription filter. Setting this up establishes a real-time data feed from the log group to your Firehose delivery stream.

At present, you have to use the AWS Command Line Interface (AWS CLI) to create a CloudWatch Logs subscription to a Kinesis Data Firehose stream. However, you can use the AWS console to create subscriptions to Lambda and Amazon Elasticsearch Service.

To allow CloudWatch to publish to your Firehose stream, you need to give it permissions.

$ aws iam create-role --role-name CWLtoKinesisFirehoseRole --assume-role-policy-document file://TrustPolicyForCWLToFireHose.json


Here is the content for TrustPolicyForCWLToFireHose.json.

{
  "Statement": {
    "Effect": "Allow",
    "Principal": { "Service": "logs.us-east-1.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }
}

 

Attach the policy to the newly created role.

$ aws iam put-role-policy \
    --role-name CWLtoKinesisFirehoseRole \
    --policy-name Permissions-Policy-For-CWL \
    --policy-document file://PermissionPolicyForCWLToFireHose.json

Here is the content for PermissionPolicyForCWLToFireHose.json.

{
    "Statement":[
      {
        "Effect":"Allow",
        "Action":["firehose:*"],
        "Resource":["arn:aws:firehose:us-east-1:YOUR-AWS-ACCT-NUM:deliverystream/ FirehoseSplunkDeliveryStream"]
      },
      {
        "Effect":"Allow",
        "Action":["iam:PassRole"],
        "Resource":["arn:aws:iam::YOUR-AWS-ACCT-NUM:role/CWLtoKinesisFirehoseRole"]
      }
    ]
}

Finally, create a subscription filter.

$ aws logs put-subscription-filter \
   --log-group-name "/vpc/flowlog/FirehoseSplunkDemo" \
   --filter-name "Destination" \
   --filter-pattern "" \
   --destination-arn "arn:aws:firehose:us-east-1:YOUR-AWS-ACCT-NUM:deliverystream/FirehoseSplunkDeliveryStream" \
   --role-arn "arn:aws:iam::YOUR-AWS-ACCT-NUM:role/CWLtoKinesisFirehoseRole"

When you run the AWS CLI command preceding, you don’t get any acknowledgment. To validate that your CloudWatch Log Group is subscribed to your Firehose stream, check the CloudWatch console.
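Alternatively, you can confirm the subscription from the CLI. The command below simply lists the subscription filters attached to the log group used in this walkthrough; the Firehose destination ARN should appear in the output.

$ aws logs describe-subscription-filters \
    --log-group-name "/vpc/flowlog/FirehoseSplunkDemo"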

As soon as the subscription filter is created, the real-time log data from the log group goes into your Firehose delivery stream. Your stream then delivers it to your Splunk Enterprise or Splunk Cloud environment for querying and visualization. The screenshot following is from Splunk Enterprise.

In addition, you can monitor and view metrics associated with your delivery stream using the AWS console.
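If you prefer the CLI, the delivery stream's Splunk delivery metrics can also be pulled from CloudWatch. This is a sketch under assumptions: it reuses the stream name from this walkthrough and the DeliveryToSplunk.Success metric name as I recall it from the Firehose monitoring documentation; adjust the metric name if it differs in the console, and set the time window to cover your own test run.

# Query Firehose-to-Splunk delivery success over a sample one-hour window
$ aws cloudwatch get-metric-statistics \
    --namespace AWS/Firehose \
    --metric-name DeliveryToSplunk.Success \
    --dimensions Name=DeliveryStreamName,Value=FirehoseSplunkDeliveryStream \
    --start-time 2018-01-01T00:00:00Z \
    --end-time 2018-01-01T01:00:00Z \
    --period 300 \
    --statistics Average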

Conclusion

Although our walkthrough uses VPC Flow Logs, the pattern can be used in many other scenarios. These include ingesting data from AWS IoT, other CloudWatch logs and events, Kinesis Streams or other data sources using the Kinesis Agent or Kinesis Producer Library. We also used Lambda blueprint Kinesis Data Firehose CloudWatch Logs Processor to transform streaming records from Kinesis Data Firehose. However, you might need to use a different Lambda blueprint or disable record transformation entirely depending on your use case. For an additional use case using Kinesis Data Firehose, check out This is My Architecture Video, which discusses how to securely centralize cross-account data analytics using Kinesis and Splunk.

 


Additional Reading

If you found this post useful, be sure to check out Integrating Splunk with Amazon Kinesis Streams and Using Amazon EMR and Hunk for Rapid Response Log Analysis and Review.


About the Authors

Tarik Makota is a solutions architect with the Amazon Web Services Partner Network. He provides technical guidance, design advice, and thought leadership to AWS’ most strategic software partners. His career includes work in an extremely broad range of software development and architecture roles across ERP, financial printing, benefit delivery and administration, and financial services. He holds an M.S. in Software Development and Management from Rochester Institute of Technology.

 

 

 

Roy Arsan is a solutions architect in the Splunk Partner Integrations team. He has a background in product development, cloud architecture, and building consumer and enterprise cloud applications. More recently, he has architected Splunk solutions on major cloud providers, including an AWS Quick Start for Splunk that enables AWS users to easily deploy distributed Splunk Enterprise straight from their AWS console. He’s also the co-author of the AWS Lambda blueprints for Splunk. He holds an M.S. in Computer Science Engineering from the University of Michigan.

 

 

 

Lorelei Joins The Operations Crew

Post Syndicated from Yev original https://www.backblaze.com/blog/lorelei-joins-operations-crew/

We’ve eclipsed the 400 Petabyte mark and our data center continues to grow. What does that mean? It means we need more great people working in our data centers making sure that the hard drives keep spinning and that sputtering drives are promptly dealt with. Lorelei is the newest Data Center Technician to join our ranks. Let’s learn a bit more about Lorelei, shall we?

What is your Backblaze Title?
DC Tech!! I’m the saucy one.

Where are you originally from?
San Francisco/Bowling Green, Ohio. Just moved up to Sacramento this year, and it’s so nice to have four seasons again. I’m drowning in leaves but I’m totally OK with it.

What attracted you to Backblaze?
I was a librarian in my previous life, mainly because I believe that information should be open to everyone. I was familiar with Backblaze prior to joining the team, and I’m a huge fan of their fresh approach to sharing information and openness. The interview process was also the coolest one I’ll ever have!

What do you expect to learn while being at Backblaze?
A lot about Linux!

Where else have you worked?
A chocolate factory and a popular culture library.

Where did you go to school?
CSU East Bay, Bowling Green State University (go Falcons), and Clarion.

Favorite place you’ve traveled?
Stockholm & Tokyo! I hope to travel more in Asia and Europe.

Favorite hobby?
Music is not magic, but music is…
Come sing with me @ karaoke!

Favorite food?
I love trying new food. I love anything that’s acidic, sweet, fresh, salty, flavorful. Fruit is the best food, but everything else is good too. I’m one of those Yelp people: always seeking & giving food recs!

Why do you like certain things?
I like things that make me happy and that make other people happy. Have fun & enjoy life. Yeeeeehaw.

Welcome to the team Lorelei. And thank you very much for leaving Yelp reviews. It’s nice to give back to the community!

The post Lorelei Joins The Operations Crew appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

University College London is Accidentally Running a Huge “Pirate” Movie Site

Post Syndicated from Andy original https://torrentfreak.com/university-college-london-is-accidentally-running-a-huge-pirate-movie-site-171216/

If someone wants to obtain the latest movies for free, all they need to do is head over to the nearest torrent or streaming portal, press a few buttons, and the content appears in a matter of seconds or minutes, depending on the choice.

Indeed, for those seeking mainstream content DRM-free, this is the only way to obtain it, since studios generally don’t make their content available in this fashion. But we know an establishment that does, on a grand scale.

University College London is the third largest university in the UK. According to accounts (pdf) published this summer, it has revenues of more than £1.32 billion. Somewhat surprisingly, this educational behemoth also has a sensational multimedia trick up its considerable sleeve.

The university’s website, located at UCL.ac.uk, is a polished affair and provides all the information anyone could need. However, until one browses to the Self-Access Centre, the full glory of the platform remains largely hidden.

Located at resources.clie.ucl.ac.uk/home/sac/english/films, it looks not unlike Netflix, or indeed any one of thousands of pirate streaming sites around today. However, it appears to be intended for university and educational use only.

UCL’s Self-Access Centre

“Welcome to the Self-Access Centre materials database. Here you can find out about the English materials we have in the SAC and explore our online materials,” the site reads.

“They were designed to help you improve your English skills. Most of the video materials, including films and documentaries, are now available to be watched online. Log on with your UCL id and password to watch them!”

According to a university video tutorial, all content on the SAC can be viewed on campus or from home, as long as a proper login and password is entered. The material is provided for educational purposes and when viewed through the portal, is accompanied by questions, notes, and various exercises.

Trouble is, the entire system is open to the wider Internet, with no logins or passwords required.

A sample of the movies on offer for direct download

The above image doesn’t even begin to scratch the surface. In one directory alone, TorrentFreak counted more than 700 English language movies. In another, more than 600 documentaries including all episodes of the BBC’s Blue Planet II. World Cinema produced close to 90 results, with hundreds of titles voiced in languages from Arabic to Japanese to Welsh.

Links can be pasted into VLC and streamed direct

Quite how long this massive trove of films and TV shows has been open to the public isn’t clear but a simple Google search reveals not only the content itself, but also links to movies and other material on sites in the Middle East and social networks in Russia.

Some of them date back to at least 2016 so it’s probably safe to assume that untold terabytes of data have already been liberated from the university’s servers for the pleasure of the public.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

The Operations Team Just Got Rich-er!

Post Syndicated from Yev original https://www.backblaze.com/blog/operations-team-just-got-rich-er/

We’re growing at a pretty rapid clip, and as we add more customers, we need people to help keep all of our hard drives spinning. Along with support, the other department that grows linearly with the number of customers that join us is the operations team, and they’ve just added a new member to their team, Rich! He joins us as a Network Systems Administrator! Let’s take a moment to learn more about Rich, shall we?

What is your Backblaze Title?
Network Systems Administrator

Where are you originally from?
The Upper Peninsula of Michigan. Da UP, eh!

What attracted you to Backblaze?
The fact that it is a small tech company packed with highly intelligent people and a place where I can also be friends with my peers. I am also huge on cloud storage and backing up your past!

What do you expect to learn while being at Backblaze?
I look forward to expanding my Networking skills and System Administration skills while helping build the best Cloud Storage and Backup Company there is!

Where else have you worked?
I first started working in Data Centers at Viawest. I was previously an Infrastructure Engineer at Twitter and a Production Engineer at Groupon.

Where did you go to school?
I started at Finlandia University in Northern Michigan, carried on to Northwest Florida State, and graduated with my A.S. from North Lake College in Dallas, TX. I then completed my B.S. degree online at WGU.

What’s your dream job?
Sr. Network Engineer

Favorite place you’ve traveled?
I have traveled around a bit in my life. I really liked Dublin, Ireland but I have to say favorite has to be Puerto Vallarta, Mexico! Which is actually where I am getting married in 2019!

Favorite hobby?
Water is my life. I like to wakeboard and wakesurf. I also enjoy biking, hunting, fishing, camping, and anything that has to do with the great outdoors!

Of what achievement are you most proud?
I’m proud of moving up in my career as quickly as I have been. I am also very proud of being able to wakesurf behind a boat without a rope! Lol!

Star Trek or Star Wars?
Star Trek! I grew up on it!

Coke or Pepsi?
H2O 😀

Favorite food?
Mexican Food and Pizza!

Why do you like certain things?
Hmm…. because certain things make other certain things particularly certain!

Anything else you’d like you’d like to tell us?
Nope 😀

Who can say no to high quality H2O? Welcome to the team Rich!

The post The Operations Team Just Got Rich-er! appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Netflix Is Not Going to Kill Piracy, Research Suggests

Post Syndicated from Ernesto original https://torrentfreak.com/netflix-not-going-kill-piracy-research-suggests-171129/

There is little doubt that, in many countries, Netflix has become the standard for watching movies on the Internet.

Generally speaking, on-demand streaming services are convenient alternatives to piracy. However, millions of people stick to their old pirate habits, Netflix subscription or not.

Intrigued by this interplay of legal and unauthorized viewing, researchers from Carnegie Mellon University and Universidade Católica Portuguesa carried out an extensive study. They partnered with a major telco, which is not named, to analyze if BitTorrent downloading habits can be changed by offering legal alternatives.

The researchers used a piracy-tracking firm to get a sample of thousands of BitTorrent pirates at the associated ISP. Half of them were offered a free 45-day subscription to a premium TV and movies package, allowing them to watch popular content on demand.

To measure the effects of video-on-demand access on piracy, the researchers then monitored the legal viewing activity and BitTorrent transfers of the people who received the free offer, comparing it to a control group. The results show that piracy is harder to beat than some would expect.

Subscribers who received the free subscription watched more TV, but overall their torrenting habits didn’t change significantly.

“We find that, on average, households that received the gift increased overall TV consumption by 4.6% and reduced Internet downloads and uploads by 4.2% and 4.5%, respectively. However, and also on average, treated households did not change their likelihood of using BitTorrent during the experiment,” the researchers write.

One of the main problems was that these ‘pirates’ couldn’t get all their favorite shows and movies on the legal service, which is a common problem. For the small portion of subscribers who had access to their preferred content, the researchers did find an effect on torrent traffic.

“Households with preferences aligned with the gifted content reduced their probability of using BitTorrent during the experiment by 18% and decreased their amount of upload traffic by 45%,” the paper reads.

The video-on-demand service in the study had an average “fit” of just 12% with people’s viewing preferences, which means that they were missing a lot of content. But even Netflix, which has a library of thousands of titles, only has a fit of roughly 50%.

The researchers show that the lack of availability is partly caused by licensing windows, which makes it hard for legal video streaming services to compete with piracy.

“We show that licensing windows impose significant restrictions on the content that can be included in SVoD catalogs, which hampers the ability of content distributors to offer catalogs that cater to the preferences of pirates,” they write.

However, even if more content became available, piracy wouldn’t magically disappear. In the experiment, subscribers were offered free access to a video on demand service. In the real world, they would have to pay, which presents another barrier.

In this study, the pirate households were willing to pay at most $3.25 USD per month to access a service with a library as large as Netflix’s in the United States. That’s not enough.

This leads the researchers to the grim conclusion that video on demand services such as Netflix can’t significantly lower piracy rates. They could make a dent if they increase their content libraries while lowering the price at the same time, but that’s not going to happen.

“Together, our results show that, as a stand-alone strategy, using legal SVoD to curtail piracy will require, at the minimum, offering content much earlier and at much lower prices than those currently offered in the marketplace, changes that are likely to reduce industry revenue and that may damage overall incentives to produce new content while, at the same time, curbing only a small share of piracy,” the researchers conclude.

While Hollywood maintains that people can get pretty much anything they want legally, the current research shows that it’s not as simple as that. Most people are not going to pay for 22 separate subscriptions. Instead of more streaming services, it would be better to make more content available at the ones that are already out there.

The research was partially funded by the Carnegie Mellon University’s IDEA, which receives an unrestricted gift from the MPAA, so Hollywood will likely be clued in on the results.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Raspberry Pi clusters come of age

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-clusters-come-of-age/

In today’s guest post, Bruce Tulloch, CEO and Managing Director of BitScope Designs, discusses the uses of cluster computing with the Raspberry Pi, and the recent pilot of the Los Alamos National Laboratory 3,000-core Raspberry Pi cluster built with the BitScope Blade.

Raspberry Pi cluster

High-performance computing and Raspberry Pi are not normally uttered in the same breath, but Los Alamos National Laboratory is building a Raspberry Pi cluster with 3,000 cores as a pilot before scaling up to 40,000 cores or more next year.

That’s amazing, but why?

I was asked this question more than any other at SC17, The International Conference for High-Performance Computing, Networking, Storage and Analysis, held in Denver last week, where one of the Los Alamos Raspberry Pi Cluster Modules was on display at the University of New Mexico’s Center for Advanced Research Computing booth.

The short answer to this question is: the Raspberry Pi cluster enables Los Alamos National Laboratory (LANL) to conduct exascale computing R&D.

The Pi cluster breadboard

Exascale refers to computing systems at least 50 times faster than the most powerful supercomputers in use today. The problem faced by LANL and similar labs building these things is one of scale. To get the required performance, you need a lot of nodes, and to make it work, you need a lot of R&D.

However, there’s a catch-22: how do you write the operating systems, network stacks, and launch and boot systems for such large computers without having one on which to test it all? Use an existing supercomputer? No — the existing large clusters are fully booked 24/7 doing science, they cost millions of dollars per year to run, and they may not have the architecture you need for your next-generation machine anyway. Older machines retired from science may be available, but at this scale they cost far too much to use and are usually very hard to maintain.

The Los Alamos solution? Build a “model supercomputer” with Raspberry Pi!

Think of it as a “cluster development breadboard”.

The idea is to design, develop, debug, and test new network architectures and systems software on the “breadboard”, but at a scale equivalent to the production machines you’re currently building. Raspberry Pi may be a small computer, but it can run most of the system software stacks that production machines use, and the ratios of its CPU speed, local memory, and network bandwidth scale proportionately to the big machines, much like an architect’s model does when building a new house. To learn more about the project, see the news conference and this interview with insideHPC at SC17.

Traditional Raspberry Pi clusters

Like most people, we love a good cluster! People have been building them with Raspberry Pi since the beginning, because it’s inexpensive, educational, and fun. They’ve been built with the original Pi, Pi 2, Pi 3, and even the Pi Zero, but none of these clusters have proven to be particularly practical.

That’s not stopped them being useful though! I saw quite a few Raspberry Pi clusters at the conference last week.

One tiny one that caught my eye was from the people at openio.io, who used a small Raspberry Pi Zero W cluster to demonstrate their scalable software-defined object storage platform, which on big machines is used to manage petabytes of data, but which is so lightweight that it runs just fine on this:

Raspberry Pi Zero cluster

There was another appealing example at the ARM booth, where Berkeley Lab’s Singularity container platform was demonstrated running very effectively on a small cluster built with Raspberry Pi 3s.

Raspberry Pi 3 cluster demo at a conference stall

My show favourite was from the Edinburgh Parallel Computing Center (EPCC): Nick Brown used a cluster of Pi 3s to explain supercomputers to kids with an engaging interactive application. The idea was that visitors to the stand design an aircraft wing, simulate it across the cluster, and work out whether an aircraft that uses the new wing could fly from Edinburgh to New York on a full tank of fuel. Mine made it, fortunately!

Raspberry Pi 3 cluster demo at a conference stall

Next-generation Raspberry Pi clusters

We’ve been building small-scale industrial-strength Raspberry Pi clusters for a while now with BitScope Blade.

When Los Alamos National Laboratory approached us via HPC provider SICORP with a request to build a cluster comprising many thousands of nodes, we considered all the options very carefully. It needed to be dense, reliable, low-power, and easy to configure and to build. It did not need to “do science”, but it did need to work in almost every other way as a full-scale HPC cluster would.

Some people argue Compute Module 3 is the ideal cluster building block. It’s very small and just as powerful as Raspberry Pi 3, so one could, in theory, pack a lot of them into a very small space. However, there are very good reasons no one has ever successfully done this. For a start, you need to build your own network fabric and I/O, and cooling the CM3s, especially when densely packed in a cluster, is tricky given their tiny size. There’s very little room for heatsinks, and the tiny PCBs dissipate very little excess heat.

Instead, we saw the potential for Raspberry Pi 3 itself to be used to build “industrial-strength clusters” with BitScope Blade. It works best when the Pis are properly mounted, powered reliably, and cooled effectively. It’s important to avoid using micro SD cards and to connect the nodes using wired networks. The Pi 3 has the added benefit of coming with lots of “free” USB I/O, and its PCB, when mounted with the correct air-flow, is a remarkably good heatsink.

When Raspberry Pi’s Gordon Hollingworth announced netboot support, we became convinced the Raspberry Pi 3 was the ideal candidate when used with standard switches. We’d been making smaller clusters for a while, but netboot made larger ones practical. Assembling them all into compact units that fit into existing racks with multiple 10 Gb uplinks is the solution that meets LANL’s needs. This is a 60-node cluster pack with a pair of managed switches by Ubiquiti in testing in the BitScope Lab:

60-node Raspberry Pi cluster pack

Two of these packs, built with Blade Quattro, and one smaller one comprising 30 nodes, built with Blade Duo, are the components of the Cluster Module we exhibited at the show. Five of these modules are going into Los Alamos National Laboratory for their pilot as I write this.
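As an aside (this is not part of BitScope’s or LANL’s tooling), once a netbooted rack like this powers up, the first sanity check is simply whether every node answers on the network. Here is a minimal sketch, assuming the nodes resolve as node001 through node060 and that the control machine has the standard Linux ping utility:

#!/usr/bin/env python3
# Minimal reachability check for a 60-node Pi cluster pack.
# The hostnames and node count are assumptions for illustration.

import subprocess
from concurrent.futures import ThreadPoolExecutor

NODES = [f"node{n:03d}" for n in range(1, 61)]

def ping(host: str) -> bool:
    """Return True if the host answers one ICMP echo within a second."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

with ThreadPoolExecutor(max_workers=20) as pool:
    status = dict(zip(NODES, pool.map(ping, NODES)))

down = [host for host, up in status.items() if not up]
print(f"{len(NODES) - len(down)}/{len(NODES)} nodes reachable")
if down:
    print("unreachable:", ", ".join(down))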

Bruce Tulloch at a conference stand with a demo of the Raspberry Pi cluster for LANL

It’s not only research clusters like this for which Raspberry Pi is well suited. You can build very reliable local cloud computing and data centre solutions for research, education, and even some industrial applications. You’re not going to get much heavy-duty science, big data analytics, AI, or serious number crunching done on one of these, but it is quite amazing to see just how useful Raspberry Pi clusters can be for other purposes, whether it’s software-defined networks, lightweight MaaS, SaaS, PaaS, or FaaS solutions, distributed storage, edge computing, industrial IoT, and of course, education in all things cluster and parallel computing. For one live example, check out Mythic Beasts’ educational compute cloud, built with Raspberry Pi 3.

For more information about Raspberry Pi clusters, drop by BitScope Clusters.

I’ll read and respond to your thoughts in the comments below this post too.

Editor’s note:

Here is a photo of Bruce wearing a jetpack. Cool, right?!

Bruce Tulloch wearing a jetpack

The post Raspberry Pi clusters come of age appeared first on Raspberry Pi.

Danes Deploy ‘Disruption Machine’ to Curb Online Piracy

Post Syndicated from Ernesto original https://torrentfreak.com/danes-deploy-disruption-machine-to-curb-online-piracy-171119/

Over the years copyright holders have tried a multitude of measures to curb copyright infringement, with varying levels of success.

By now it’s well known that blocking or even shutting down a pirate site doesn’t help much. As long as there are alternatives, people will simply continue to download or stream elsewhere.

Increasingly, major entertainment industry companies are calling for a broader and more coordinated response. They would like to see ISPs, payment processors, advertisers, search engines, and social media companies assisting in their anti-piracy efforts. Voluntarily, or even with a legal incentive, if required.

In Denmark, local anti-piracy group RettighedsAlliancen has a similar goal and they are starting to make progress. The outfit is actively building a piracy “disruption machine” that tackles the issue from as many sides as it can.

The disruption machine is built around an Infringing Website List (IWL), which is not related to a similarly-named initiative from the UK police. This list is made up of pirate sites that have been found to facilitate copyright infringement by a Danish court.

“The IWL is a part of the disruption machine that RettighedsAlliancen has developed in collaboration with many stakeholders in the online community,” the group’s CEO Maria Fredenslund tells TorrentFreak.

The stakeholders include major ISPs, but also media companies, MasterCard, Google, and Microsoft. With help from the local government they signed a Memorandum of Understanding. Their goal is to make the internet a safe and legitimate platform for consumers and businesses while limiting copyright infringement and associated crime.

MoU signees

There are currently twelve court orders on which the list is based and two more are expected to come in before the end of the year. As a result, approximately 600 pirate sites are on the IWL, making them harder to find.

Every time a new court order is handed down, RettighedsAlliancen distributes an updated list to its network of stakeholders.

“Currently, all major ISPs in Denmark have agreed to implement the IWL in their systems based on a joint Code of Conduct. This means that all the ISPs jointly will block their customers access to infringing services thus amplifying the impact of a blocking order by magnitudes,” Fredenslund explains.

Thus far ISPs are actively blocking 100 pirate sites, resulting in significant traffic drops. The rest of the list has yet to be implemented.
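To illustrate the mechanics, a network operator consuming an IWL-style list essentially checks each requested domain, and its parent domains, against the list before resolving or serving it. The sketch below is hypothetical, not RettighedsAlliancen’s or any ISP’s actual implementation, and the domain names are placeholders.

# Hypothetical IWL-style blocklist check. The domains are placeholders;
# a real deployment would load the court-ordered list from a trusted feed.

BLOCKED_DOMAINS = {
    "example-pirate-site.com",
    "another-blocked-site.net",
}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname or any parent domain is on the list."""
    labels = hostname.lower().rstrip(".").split(".")
    for i in range(len(labels) - 1):  # never match the bare TLD
        if ".".join(labels[i:]) in BLOCKED_DOMAINS:
            return True
    return False

for host in ("example-pirate-site.com", "cdn.example-pirate-site.com", "lwn.net"):
    print(host, "blocked" if is_blocked(host) else "allowed")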

The IWL is also used in the online advertising industry, where several major advertising brokers have signed a joint agreement not to place advertising on these sites. This cuts off part of the revenue stream to pirate sites, which, in theory, should make them less profitable.

A similar approach is being taken by major payment providers, who are preventing known pirate sites from processing transactions through their services. Every company has its own measures, but the overlapping goal is to frustrate pirate sites and reduce copyright infringement.

The Disruption Machine

It’s interesting to see Google listed as a partner, since the company doesn’t support general website blockades. However, Google has said that it will demote sites on the IWL in its search results.

While these are all positive developments, according to the anti-piracy group, it’s just the start. RettighedsAlliancen also believes other tools and services could join in. Browser plugins could use the IWL to identify illegal sites, for example, and the options are endless.

“Likewise, large companies, institutions, and public authorities are also well-suited to implement the IWL in their local networks. For example, to prevent students from accessing illegal content while at school or university,” Fredenslund says.

“Looking further ahead, social media platforms such as Facebook are used to a great extent to consume content online and it is therefore obvious that they should also incorporate the IWL in their systems to prevent their users from harm and preventing copyright infringement.”

This model is not entirely new, of course. We’ve seen several of its elements implemented in other countries as well, and copyright holders have been pushing voluntary agreements for quite some time now.

What’s new, however, is that the Danish group has clearly defined it as a strategy. And labeling that strategy a “disruption machine” makes it sound effective, which is part of the job.


[$] ROCA: Return Of the Coppersmith Attack

Post Syndicated from jake original https://lwn.net/Articles/738896/rss

On October 30, 2017, a group of Czech researchers from Masaryk University presented the ROCA paper at the ACM CCS Conference, where it earned the Real-World Impact Award. We briefly mentioned ROCA when it was first reported, but haven’t dug into the details of the vulnerability yet. Because of its far-ranging impact, it seems important to review the vulnerability in light of the new results published recently.

Google’s Data on Login Thefts

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2017/11/googles_data_on.html

This is interesting research and data:

With Google accounts as a case-study, we teamed up with the University of California, Berkeley to better understand how hijackers attempt to take over accounts in the wild. From March 2016 to March 2017, we analyzed several black markets to see how hijackers steal passwords and other sensitive data.

[…]

Our research tracked several black markets that traded third-party password breaches, as well as 25,000 blackhat tools used for phishing and keylogging. In total, these sources helped us identify 788,000 credentials stolen via keyloggers, 12 million credentials stolen via phishing, and 3.3 billion credentials exposed by third-party breaches.

The report.

New Research in Invisible Inks

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2017/11/new_research_in.html

It’s a lot more chemistry than I understand:

Invisible inks based on “smart” fluorescent materials have been shining brightly (if only you could see them) in the data-encryption/decryption arena lately…. But some of the materials are costly or difficult to prepare, and many of these inks remain somewhat visible when illuminated with ambient or ultraviolet light. Liang Li and coworkers at Shanghai Jiao Tong University may have come up with a way to get around those problems. The team prepared a colorless solution of an inexpensive lead-based metal-organic framework (MOF) compound and used it in an ink-jet printer to create completely invisible patterns on paper. Then they exposed the paper to a methylammonium bromide decryption solution…revealing the pattern…. They rendered the pattern invisible again by briefly treating the paper with a polar solvent….

Full paper.

[$] A report from the Realtime Summit

Post Syndicated from jake original https://lwn.net/Articles/738001/rss

The 2017 Realtime Summit (RT-Summit) was hosted by the Czech Technical University on Saturday, October 21 in Prague, just before the Embedded Linux Conference. It was attended by more than 50 individuals with backgrounds ranging from academic to industrial, and some local students daring enough to spend a day with that group. Guest author Mathieu Poirier provides summaries of some of the talks from the summit.

BitBarista: a fully autonomous corporation

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/bitbarista/

To some people, the idea of a fully autonomous corporation might seem like the beginning of the end. However, while the BitBarista coffee machine prototype can indeed run itself without any human intervention, it also teaches a lesson about ethical responsibility and the value of quality.

BitBarista

Bitcoin coffee machine that engages coffee drinkers in the value chain

Autonomous corporations

If you’ve played Paperclips, you get it. And in case you haven’t played Paperclips, I will only say this: give a robot one job and full control to complete the task, and things may turn in a very unexpected direction. Or, in the case of Rick and Morty, they end in emotional breakdown.

BitBarista

While the fully autonomous BitBarista resides primarily on the drawing board, the team at the University of Edinburgh’s Center for Design Informatics have built a proof-of-concept using a Raspberry Pi and a Delonghi coffee maker.

BitBarista fully autonomous coffee machine using Raspberry Pi

Recently described by the BBC as ‘a coffee machine with a life of its own, dispensing coffee to punters with an ethical preference’, BitBarista works in conjunction with customers to source coffee and complete maintenance tasks in exchange for Bitcoin payments. Customers pay for their coffee in Bitcoin, and when BitBarista needs maintenance such as cleaning, water replenishment, or restocking, it can pay the same customers for completing those tasks.
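The basic economic loop is easy to picture in code. The sketch below is hypothetical and not the Design Informatics implementation: the sensor readings and Bitcoin wallet calls are stubbed out, and the prices and rewards are made-up figures.

# Hypothetical sketch of BitBarista's economic loop: accept payment for a
# coffee, and when the machine needs servicing, offer a small Bitcoin
# reward to whoever completes the task. Hardware and wallet calls are stubs.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    reward_btc: float

MAINTENANCE_TASKS = [
    Task("refill water", 0.0001),
    Task("empty coffee grounds", 0.0001),
    Task("restock beans", 0.0002),
]

COFFEE_PRICE_BTC = 0.0003  # illustrative price only

def payment_received(amount_btc: float) -> bool:
    print(f"waiting for {amount_btc} BTC...")
    return True  # stub: would watch a wallet address for an incoming payment

def dispense_coffee() -> None:
    print("dispensing coffee")  # stub: would drive the coffee maker via the Pi

def water_low() -> bool:
    return True  # stub: would read a level sensor on the Pi's GPIO pins

def offer_task(task: Task) -> None:
    print(f"offering {task.reward_btc} BTC to anyone who will {task.name}")
    # stub: on confirmation, the machine would send the reward from its wallet

if payment_received(COFFEE_PRICE_BTC):
    dispense_coffee()

if water_low():
    offer_task(MAINTENANCE_TASKS[0])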

BitBarista fully autonomous coffee machine using Raspberry Pi

Moreover, customers choose which coffee beans the machine purchases based on quality, price, environmental impact, and social responsibility. BitBarista also collects and displays data on the most common bean choices.

BitBarista fully autonomous coffee machine using Raspberry Pi

So not only is BitBarista a study of the concept of full autonomy, it is also a way of collecting data on how people weigh cost against social and environmental responsibility.

For more information on BitBarista, visit the Design Informatics and PETRAS websites.

Home-made autonomy

Many people already have store-bought autonomous technology in their homes, such as the Roomba vacuum cleaner or the Nest Smart Thermostat. And within the maker community, many more have created such devices using sensors, mobile apps, and single-board computers such as the Raspberry Pi. We see examples using the Raspberry Pi on a daily basis, from simple motion-controlled lights and security cameras to advanced devices that use temperature sensors and WiFi technology to detect the presence of specific people.

How to Make a Smart Security Camera with a Raspberry Pi Zero

In this video, we use a Raspberry Pi Zero W and a Raspberry Pi camera to make a smart security camera! The camera uses object detection (with OpenCV) to send you an email whenever it sees an intruder. It also runs a webcam so you can view live video from the camera when you are away.

To get started building your own autonomous technology, you could have a look at our resources Laser tripwire and Getting started with picamera. These will help you build a visitor register of everyone who crosses the threshold of a specific room.
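As a rough starting point, here is a minimal sketch of such a visitor register, assuming a PIR motion sensor on GPIO 4 plus the picamera and gpiozero libraries; the pin number and file paths are placeholders to adapt to your own setup.

# Minimal visitor register sketch: when motion is detected, save a
# timestamped photo and append a line to a log file. Pin numbers and
# paths are assumptions; adjust them for your own wiring and storage.

from datetime import datetime
from gpiozero import MotionSensor
from picamera import PiCamera

pir = MotionSensor(4)   # PIR sensor wired to GPIO 4
camera = PiCamera()

while True:
    pir.wait_for_motion()                # block until someone walks in
    stamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    camera.capture(f"/home/pi/visitors/{stamp}.jpg")
    with open("/home/pi/visitors/register.csv", "a") as log:
        log.write(stamp + "\n")
    pir.wait_for_no_motion()             # wait for the room to clear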

Or build your own Raspberry Pi Zero W Butter Robot for the lolz.

The post BitBarista: a fully autonomous corporation appeared first on Raspberry Pi.

Copyright Professor: Don’t Pay Those File-Sharing ‘Fines’

Post Syndicated from Ernesto original https://torrentfreak.com/copyright-professor-dont-pay-those-file-sharing-fines-171027/

In recent years, file-sharers around the world have been pressured to pay significant settlement fees, or face legal repercussions.

Sweden is not spared from these practices. A recent wave of threatening letters, sent out on behalf of film distributors including those behind the zombie movie Cell, targets thousands of local Internet users.

The campaign is coordinated by Danish law firm Njord Law. The company accuses people of downloading the movie without permission and demands a settlement, as is common with these copyright troll schemes.

The scope of the latest campaign is enormous: 20,000 new IP addresses were collected. Swedish courts can order ISPs to reveal the identities behind thousands of IP addresses in a single batch. That’s quite a lot compared to the US, where the same filmmakers can target only a dozen Internet accounts at a time.

While recipients of these letters can be easily scared by the legal language and proposed 4,500 SEK [$550] settlement, not all experts are impressed.

Sanna Wolk, Intellectual Property Professor at Uppsala University, recommends that people ignore the letters entirely.

“Do not pay. You do not even have to answer it. In the end, it’s the court that will decide whether you have to pay or not. We have seen this type of letter in the past, and only very few times those in charge of the claims have taken it to court,” Wolk tells Ny Teknik.

However, if the case does indeed move beyond a threat and goes to court then it’s important for the accused to contest the claim.

Njord Law says that it will follow up on its ‘promise’ and take people to court if they ignore the settlement requests.

Whether the firm has the resources to sue thousands of people is questionable, though. Similarly, it remains to be seen how good an IP address is as evidence, since it doesn’t identify a single person, just a connection.

The law firm also highlights that subscribers can be held liable even if someone else used their connection to download the film. However, Professor Wolk stresses that this isn’t necessarily true.

“Someone who has an open network cannot be held responsible for copyright violations – such as downloading movies – if they provide others with access to their internet connection. This has been decided in a European Court ruling last year,” she states.

The professor refers to the McFadden vs Sony Music ruling, in which the EU Court of Justice found that the operator of an open WiFi network can’t be held liable for infringements carried out by its users.

National courts have some leeway and could order someone to secure his or her WiFi connection, but this doesn’t mean that the account holder is liable for past infringements.

It’s doubtful that Njord Law and its clients will change their tune. Not everyone will read the professor’s comments, and the scheme generally thrives on the easily threatened and uninformed. Still, most of the accused will probably sleep better after reading them.


[$] The state of the realtime union

Post Syndicated from jake original https://lwn.net/Articles/737367/rss

The 2017 Realtime Summit was held October 21 at Czech Technical University in Prague to discuss all manner of topics related to realtime Linux. Nearly two years ago, a collaborative project was formed with the goal of mainlining the realtime patch set. At the summit, project lead Thomas Gleixner reported on the progress that has been made and the plans for the future.