Tag Archives: Censorship

The Google Piracy Blame Game is Headache Inducing

Post Syndicated from Andy original https://torrentfreak.com/google-piracy-blame-game-headache-inducing-160717/

Music piracy in 2016 is a somewhat curious beast. Streaming platforms are readily accessible and the services provided by outfits like Spotify outperform the vast majority of pirate sites.

With many legitimate platforms providing an ad-supported free tier, it’s even difficult to complain about the price. Still, some people prefer to pirate and this infuriates the labels, and understandably so. Sadly, however, their response is to blame people that have nothing to do with that infringement.

After being put under intense pressure by copyright holders, Google now feels obliged to let everyone know what measures it’s taking against this kind of piracy. This week it produced a comprehensive report covering every possible angle. Within minutes the record labels had responded, not with thanks, but with intense criticism.

On a personal level I’d like to think that Google is now pretty pissed off, and this is coming from someone who supports artists with subscriptions to Spotify, Deezer and Digitally Imported, and purchases from Beatport and Juno.

For the millionth time, Google does not engage in copyright infringement, yet faced with a problem they can’t solve on their own, the labels have adopted a strategy of painting Google as the villain. The contempt shown by the labels for a company that is already going way beyond what’s required of it under the law is quite unbelievable.

The maddening reality of it all really hits home when one reads a piece penned by the BPI’s Geoff Taylor and published in MBW this week. It begins with complaints that Content ID doesn’t work as well as it should and he invites Google to up its game.

“Despite its amazing innovations in mapping the Earth and inventing driverless cars, Google hasn’t managed to implement a Content ID system that people can’t easily get around,” Taylor complains.

First, Google had no obligation to make Content ID at all but it did and now artists are $2bn better off. Second, people invent systems, people get around them, everyone knows that. But apparently, Google is partly to blame for that too.

“Of course the fact that Google refuses to remove YouTube videos that show you exactly how to circumvent Content ID doesn’t help,” Taylor adds.

No, it’s not helpful, but what it does show is that Google isn’t prepared to stifle free speech, even if it does find it objectionable. Talking about circumventing Content ID is not a crime, nor a breach of YouTube’s terms and conditions. Those videos should stay up, no matter how annoying.

Also, it’s worth bearing in mind that when looking at any industry demands, history shows us that whatever is offered, it will never, ever be enough. Taylor’s piece demonstrates that with flying colors.

“Google should concentrate its formidable resources on making a Content ID system that is genuinely effective in protecting creators; and then apply a similar proactive system to Google search and its other services.”

Proactively censor the existence of content on the web. Right. That should be both easy and completely problem-free.

To be fair, it’s obvious why the music industry wants Google to go down this route, but the thought of any third party becoming permanent judge and jury over what we can and cannot see online is bewildering. And that’s ignoring the fact that Content ID works for material Google hosts. Applying that to content hosted elsewhere would be a minefield, if not impossible.

But it doesn’t stop there. Also bewildering is how the labels are trying to shame Google into paying them more.

“This isn’t strictly a piracy issue, but we can’t ignore the fact that YouTube pays 1/16th as much for each of its music users as competing services like Spotify,” Taylor writes.

“It’s time that Google started sharing a fair proportion of the value it derives from YouTube with creators.”

In any other marketplace people simply don’t do business with a company if they don’t like the prices being paid, but apparently the labels are being held to ransom.

That being said, since we’re playing this game of “fair proportions”, consider this. YouTube makes pretty much no money. Does the BPI want a share of that?

Perhaps the most frustrating complaint of all is that the BPI and others are still insisting that pirate sites turn up in search results for music content.

Let’s be clear: the most popular pirate sites do not turn up in the top results because they’re all being downranked by Google’s anti-piracy algorithm. This means that sites that most people have never heard of get pushed up the list, apparently above legitimate offerings.

That raises the preposterous notion that the people behind many of these bottom tier pirate sites have better SEO skills than the world’s biggest music companies. That being the case, someone needs a kick in the ass – and it’s not Google.

Finally, Taylor criticizes Google for not going after sites that rip audio content from YouTube videos and convert them to MP3s.

“Although such sites breach YouTube’s terms of service and seem to contradict its business model – by turning ad-supported transient streams into permanent copies – Google continues to point to these sites in autocomplete and to host YouTube videos showing how to use them,” he writes.

Again, the BPI is asking for censorship of content that simply isn’t illegal. But more than that it’s yet again demanding action from YouTube when it could take action itself. If these sites are illegal, why aren’t they being added to the UK’s national website blocking list, for example?

The problem with this continual assault on Google is that it’s not only tiresome but it largely misses the point. Google already does way more than the law requires yet it only has control over content hosted on YouTube. No matter what actions it takes, it simply cannot remove illicit content from the web, it can only make it a bit less visible.

Google can look after itself, but copyright holders should be extremely cautious of treating its many overtures with this level of contempt. One volunteer is worth ten pressed men and one can only guess at how much patience Google has left.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Expanding Pirate Site Blocks Spark Censorship Fears

Post Syndicated from Ernesto original https://torrentfreak.com/expanding-pirate-site-blocks-spark-censorship-fears-160714/

Last year Norway joined the ranks of countries where ISPs are ordered to block access to websites at the behest of entertainment industry companies.

In a case started by the Motion Picture Association (MPA), a local court ordered Internet providers to block users’ access to several large ‘pirate’ websites to deter online copyright infringement.

As is often the case with this type of blockade, the Hollywood movie studios didn’t stop at one attempt. They recently went back to court asking for an expansion that would target eight “pirate” streaming sites.

The court granted this request, and as a result WatchSeries, Putlocker, TUBE+, CouchTuner, Watch32, SolarMovie, ProjectFreeTV and Watch Free were added to the national blocklist.

Rune Ljøstad, a partner at the MPA’s law firm Simonsen Vogt Wiig, is happy with the outcome, which paves the way for similar blocking expansions in the future.

“Together, the decisions create a clear legal basis in Norway to block sites that make copyrighted works available to the public without permission,” Ljøstad says.

While Hollywood is understandably happy, the blocking efforts raise concerns as well. The local Pirate Party, which protested the initial blocks by launching a censorship free DNS server, fears a slippery blocking-slope that may lead to overbroad censorship.

“I’m afraid that blocking sites will have a domino effect,” says Tale Haukbjørk Østrådal, leader of the Norwegian Pirate Party.

“If we block copyright infringement now, what will be the next thing our society accepts to block? The path from blocking torrent sites to censorship is short, and I do not wish to go down that path,” she adds.

The Pirate Party sees blocking as a threat to democracy, as it’s a tool to filter and manipulate what information people can see.

“Censorship is toxic to a democracy. We need to keep the Internet free of censorship, because we need the Internet as a tool to make informed choices. A democracy is failing without informed citizens,” Østrådal notes.

There are alternatives to blocking, according to the Pirate Party’s leader. The entertainment industries should rethink their business models to compete with piracy, instead of trying to hide it.

“To find the best alternatives the entertainment industry must know why people are sharing, and change their business models. The question isn’t ‘How do we make people pay?’, it is ‘How do we let people pay and feel comfortable with our business model?’”

This means offering more content for a good price, without limitations or artificial boundaries. At the same time artists should use the Internet to connect with fans directly, cutting out the middle-man who profits from their work.

“Personally, I would love to tear down the whole entertainment industry and build it anew. The distributors were never the good guys. They have built an empire by making money from other people’s art,” Østrådal says.

“When we hear the word ‘artist’, we all think of a creative, poor person. It’s fucked up,” she adds.

The Pirate Party’s fears won’t stop Internet providers from complying with the most recent court order.

This means that the streaming sites in question are now a no-go zone. Whether the movie studios have concrete plans to expand the blocking efforts even further is unknown.


Google: Punishing Pirate Sites in Search Results Works

Post Syndicated from Ernesto original https://torrentfreak.com/google-punishing-pirate-sites-in-search-results-works-160713/

Over the past few years the entertainment industries have repeatedly asked Google to step up its game when it comes to its anti-piracy efforts.

These calls haven’t fallen on deaf ears and Google has slowly implemented various new anti-piracy measures in response.

Today, Google released an updated version of its “How Google Fights Piracy” report. The company provides an overview of all the efforts it makes to combat piracy while countering some of the entertainment industry complaints.

One of the steps Google has taken in recent years aims to downrank the most egregious “pirate” sites.

To accomplish this, Google made changes to its core algorithms which punish clear offenders. Using the number of accurate DMCA requests as an indicator, these sites are now demoted in search results for certain key phrases.
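Google hasn’t published how the demotion actually works, but the general idea of a notice-count-based ranking signal can be sketched as a toy scoring function. Everything below – the threshold, the penalty factor, the domain names – is an illustrative assumption, not Google’s actual implementation:

```python
# Toy sketch of a DMCA-based demotion signal (illustrative only;
# Google's real ranking system is proprietary and far more complex).
# Assumption: each result carries a base relevance score and a count
# of valid takedown notices received against its domain.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float    # base relevance score for the query
    dmca_notices: int   # valid takedown notices against the domain

def demoted_score(r: Result, penalty_per_notice: float = 0.001,
                  threshold: int = 100) -> float:
    """Apply a demotion penalty once a domain crosses a notice threshold."""
    if r.dmca_notices < threshold:
        return r.relevance
    excess = r.dmca_notices - threshold
    # Multiplicative penalty: heavily-noticed domains sink down the ranking.
    return r.relevance / (1.0 + penalty_per_notice * excess)

def rank(results: list[Result]) -> list[Result]:
    return sorted(results, key=demoted_score, reverse=True)

results = [
    Result("https://legit-store.example", 0.80, 2),
    Result("https://pirate-site.example", 0.95, 50_000),
]
# The legit store now outranks the heavily-noticed pirate domain,
# despite the pirate site's higher raw relevance.
for r in rank(results):
    print(r.url)
```

This also illustrates why downranking is not removal: the penalized site still appears in the list, just further down, which matches how demoted sites remain reachable for users who search for them by name.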

Despite continuing critique from rightsholders, Google notes that this change has been very effective.

“This process has proven extremely effective. Immediately upon launching improvements to our demotion signal in 2014, one major torrent site acknowledged traffic from search engines had dropped by 50% within the first week,” Google writes, citing one of our articles.

More recently, Google’s own findings confirmed this trend. As a result of the demotion policy, pirate sites lose the vast majority of their Google Search traffic.

“In May 2016, we found that demoted sites lost an average of 89% of their traffic from Google Search. These successes spur us to continue improving and refining the DMCA demotion signal.”

Despite this success, entertainment industry groups have recently called for a more rigorous response. Ideally, they would like Google to remove the results from pirate sites entirely, and make sure that infringing links don’t reappear under a different URL.

However, Google doesn’t want to go this far. The company warns that removing entire sites is dangerous as it may lead to censorship of content that’s perfectly legal.

“Whole-site removal is ineffective and can easily result in the censorship of lawful material,” Google writes.

“Blogging sites, for example, contain millions of pages from hundreds of thousands of users, as do social networking sites, e-commerce sites, and cloud computing services. All can inadvertently contain material that is infringing.”

Similarly, Google doesn’t believe in a “takedown and staydown” approach, where the company would proactively filter search results for pirated content. This would be unfeasible and unnecessary, the company states.

“One problem is that there is no way to know whether something identified as infringing in one place and at one time is also unlawful when it appears at a different place and at a different time,” Google notes.

Instead, the company says that copyright holders should use the existing takedown procedure, and target new sites when they appear so these can be downranked as well.

Finally, Google stresses that search is not a major driver of traffic to pirate sites to begin with. Only a small fraction of users reach these sites through search engines.

While the company is willing to help alleviate the problem, search engines are not the only way to eradicate piracy.

“Search engines do not control what content is on the Web. There are more than 60 trillion web addresses on the internet, and there will always be new sites dedicated to making copyrighted works available as long as there is money to be made doing so.”

Instead of focusing on search, copyright holders should take a “follow the money” approach and make sure that pirate sites are cut off from their revenue sources, Google argues.

In addition, they shouldn’t forget to offer consumers plenty of legal alternatives to piracy.

Convincing the entertainment industries of Google’s good intentions is easier said than done, though. “This report looks a lot like ‘greenwash’,” says Geoff Taylor, Chief Executive of the music industry group BPI.

“Although we welcome the measures Google has taken so far, it is still one of the key enablers of piracy on the planet. Google has the resources and the tech expertise to do much more to get rid of the illegal content on its services,” he adds.


Megaupload 2.0 to Launch With Original Megaupload User Database

Post Syndicated from Andy original https://torrentfreak.com/megaupload-2-0-to-launch-with-original-megaupload-user-database-160708/

Following a few hints earlier this week, it is now fully confirmed. Kim Dotcom will be launching a brand new file-sharing site with a familiar name.

Megaupload 2.0 is pencilled in for a January 2017 launch, an event timed to coincide with the fifth anniversary of the 2012 closure of the original Megaupload and the massive police raid on its operators.

Having successfully avoided the clutches of a hungry United States government for half a decade, this five-year anniversary is an important one for Dotcom, and it’s becoming clear he hopes to celebrate it with another poke in the eye for the Obama administration.

Details are few at this stage, but here’s what we know. Megaupload 2.0 will have 100GB of free storage. It will allow users to sync all of their devices and there will be no data transfer limits. On-the-fly encryption will be baked in.

But while site features are important, what the original Megaupload had going for it was millions of loyal users. They were all made homeless and scattered when the site was shut down but according to Dotcom, there will be a future grand reunion.

Intriguingly, the serial entrepreneur says that Megaupload 2.0 will get a fantastic start in life. Rather than simply relying on word-of-mouth advertising to get going, his new venture will launch with the original Megaupload user database intact.

How Dotcom managed to preserve a copy of this data isn’t clear, but he says that each user account held within will get a head start.

“Most Megaupload accounts will be reinstated with Premium privileges on the new Megaupload,” Dotcom announced this morning.

If every one of those former Megaupload users hit the site on day one, that’s 100 million people needing attention. It’s unlikely that anywhere near that number will come aboard, but just one or two percent would be a tremendous start.

But hosting files isn’t the only thing on Dotcom’s mind. His censorship-resistant MegaNet project is still in development and although it’s not going to be ready until 2018 at the earliest, Dotcom says that Megaupload 2.0 will be a crucial component of that network.

“Megaupload 2.0 will be the launch platform for MegaNet. Let’s make sure that we have critical mass first. #100MillionUsers,” he said this morning.

Dotcom clearly has much work to do and, even working flat-out, will struggle to meet his January deadline. Still, he doesn’t intend to do it alone.

“To former Megaupload and current Mega employees. We welcome you with open arms. Mega App developers, we have a great deal for you. Ping me,” he wrote a few hours ago.

So how will former Megaupload users know if they can use their old credentials to access the new site?

“Expect an email,” Dotcom concludes.


Kim Dotcom Hints at Second Coming of Megaupload

Post Syndicated from Andy original https://torrentfreak.com/kim-dotcom-hints-at-second-coming-of-megaupload-160706/

With multiple legal cases underway in several jurisdictions, Kim Dotcom is undoubtedly a man with things on his mind.

In New Zealand, he’s fighting extradition to the United States. And in the United States he’s fighting a government that wants to bring him to justice on charges of copyright infringement, conspiracy, money laundering and racketeering.

After dramatically launching and then leaving his Mega file-hosting site following what appears to have been an acrimonious split, many believed that Dotcom had left the file-sharing space for good. But after a period of quiet, it now transpires that the lure of storing data has proven too much of a temptation for the businessman.

In a follow-up to previous criticism of his former company, earlier today Dotcom took another shot at Mega. That was quickly followed by a somewhat surprising announcement.

“A new site is in the making. 100gb free cloud storage,” Dotcom said.

Intrigued, TorrentFreak spoke with Dotcom to find out more. Was he really planning to launch another file-sharing platform?

“I can say that this year I have set things in motion and a new cloud storage site is currently under development,” Dotcom confirmed.

“I’m excited about the new innovations the site will contain.”

When pressed on specific features for the new platform, Dotcom said it was too early to go into details. However, we do know that the site will enable users to sync all of their devices and there will be no data transfer limits.

For the privacy-conscious, Dotcom also threw out another small bone, noting that the site will also feature on-the-fly encryption. Given the German’s attention to security in recent years, it wouldn’t be a surprise if additional features are added before launch.

“Eight years of knowledge and a long planning period went into this. It will be my best creation yet,” Dotcom told TF.

A potential launch date for the site hasn’t been confirmed but the Megaupload and Mega founder is currently teasing the hashtag #5thRaidAnniversary, suggesting that his new project will launch in January 2017, five years after the Megaupload raids.

Of course, we also asked Dotcom if he’d decided on a name for his new cloud-storage site. Typically he’s playing his cards close to his chest and leaving us to fill in the blanks, but he hinted that an old name with a big reputation might be making a comeback.

“The name of the new site will make people happy,” he told us.

TF will be getting a sneak peek at the site when it’s ready for launch but in the meantime, readers might be wondering what has happened to Dotcom’s censorship-resistant MegaNet project.

“Mobile networks and devices still have to catch up with my vision for MegaNet and it will probably not be before 2018 until a beta goes live,” Dotcom concludes.


KickassTorrents Enters The Dark Web, Adds Official Tor Address

Post Syndicated from Ernesto original https://torrentfreak.com/kickasstorrents-enters-the-dark-web-adds-official-tor-address-160607/

With millions of visitors per day, KickassTorrents (KAT) is currently the most visited torrent site on the Internet.

As a result, copyright holders have taken aim at the site in recent years, resulting in ISP blockades in the UK, Finland and elsewhere. Soon, even Australia may be added to this list.

While these blocks are somewhat effective, there are also plenty of ways to circumvent them. KAT itself is operating various proxy sites, for example, and today it steps up its unblocking efforts by joining the dark web.

Through a newly launched domain KAT users can now access their favorite site on the Tor network. Tor, which stands for The Onion Router, is an encrypted anonymity network that can’t be easily blocked by ISPs.

“Good news for those who have difficulties accessing KAT due to the site block in their country, now you can always access KAT via this address lsuzvpko6w6hzpnn.onion on a TOR network,” Mr. White announces.


Tor users can access regular websites, but also dedicated Tor sites that use an .onion address. People who want to access these addresses have to be connected to the Tor network, through the special Tor browser for example.
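A .onion name like KAT’s isn’t an ordinary DNS hostname: a version 2 onion address (the format in use here) is simply 16 base32 characters derived from the hidden service’s public key, and clients hand it to a local Tor proxy rather than to a DNS resolver. A minimal stdlib-only sketch, assuming a Tor daemon listening on its default SOCKS port 9050:

```python
import re

# A v2 onion address (like KAT's) is 16 base32 characters + ".onion".
# (Newer v3 addresses are 56 characters long; not covered here.)
V2_ONION = re.compile(r"^[a-z2-7]{16}\.onion$")

def is_v2_onion(host: str) -> bool:
    """Check whether a hostname looks like a v2 onion address."""
    return bool(V2_ONION.match(host))

# Clients never resolve .onion names via DNS; they pass the name to a
# Tor SOCKS5 proxy, conventionally at 127.0.0.1:9050 when a local Tor
# daemon is running (an assumption for this sketch). The "socks5h"
# scheme tells SOCKS-aware HTTP clients to let the proxy do the
# name resolution, which is essential for .onion addresses.
TOR_PROXY = {"http": "socks5h://127.0.0.1:9050",
             "https": "socks5h://127.0.0.1:9050"}

print(is_v2_onion("lsuzvpko6w6hzpnn.onion"))  # KAT's address from the announcement
print(is_v2_onion("example.com"))             # an ordinary DNS hostname
```

In practice most users skip all of this by simply installing the Tor Browser, which bundles the daemon and routes everything through it automatically.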

TorrentFreak spoke to KAT’s Mr. White who informs us that an .onion address was added by popular request, making it easier for users to bypass even the strictest blockades.

KAT is not the first torrent site to become active on the Tor network. The Pirate Bay has had an .onion address for several years already. In addition, there are also several smaller torrent and warez communities active on the dark web.

Thus far the response from KAT users has been mostly positive, with many welcoming the bridge to the dark web.

“This is fantastic news. I had quite some difficulties trying to log in. Now no more,” one user notes. Another one adds, “welcome to Tor KAT family, nothing beats sailing on the dark net.”


Takedown, Staydown Would Be a Disaster, Internet Archive Warns

Post Syndicated from Andy original https://torrentfreak.com/takedown-staydown-would-be-a-disaster-internet-archive-warns-160607/

Currently there is a huge and coordinated effort by the world’s major copyright holders to push for changes to the Digital Millennium Copyright Act (DMCA).

In a nutshell, key entertainment industry players believe that the DMCA is no longer fit for purpose and has been twisted out of shape by pirate sites, Google and even YouTube, to work against their best interests.

One of the main problems is taking down infringing content. The legislation allows content to be removed following the issuing of a so-called DMCA notice, but copyright holders say that this descends into a game of whac-a-mole, with content repeatedly reappearing.

To end this cycle they’re pushing for a new mechanism provisionally titled ‘Takedown, Staydown’ or ‘Notice and Staydown’. This would order web platforms to ensure that once content is taken down it will never appear again on the same platform. These proposals are currently under review by the US Copyright Office.

But while copyright holders feel this would be a great tool for them, it’s perhaps unsurprising that content platforms are less enthusiastic. Having already weighed in earlier in the year, the Internet Archive, a gigantic public repository of a wide range of media, has now issued warnings that are among the sternest yet.

Noting that even the current system is regularly abused by those seeking to silence speech, the Archive says that on a daily basis it receives wrongful takedowns for content that is in the public domain, is fair use, or is critical of the content owner. Therefore, further extending takedown rights could prove extremely problematic.

“We were very concerned to hear that the Copyright Office is strongly considering recommending changing the DMCA to mandate a ‘Notice and Staydown’ regime. This is the language that the Copyright Office uses to talk about censoring the web,” the Archive warns.

The Archive has a number of concerns but key issues involve due process and user monitoring. Once a platform is in receipt of a “staydown” order, it will be required to ensure that content never reappears, regardless of the context in which it does so. This means that users posting content subject to fair use exceptions will effectively be denied their right to issue a counter-notice when their upload is blocked, thus trampling due process.

But of course, blocking content also requires that users are monitored, and the Internet Archive doesn’t like that idea at all.

“The current statute protects user privacy by explicitly stating that platforms have no duty to monitor user activity for copyright infringement. Notice and Staydown would change this – requiring platforms to be constantly looking over users’ shoulders,” the Archive warns.

With free speech potentially at stake here, the Internet Archive says that taking content down and keeping it down has constitutional implications.

“Notice and Staydown has a serious First Amendment problem. The government mandating the use of technology to affirmatively take speech offline before it’s even posted, without any form of review, potentially violates free speech laws,” it says.

Such an automated system would amount to a censorship “black box”, the Archive adds, to which the public would be denied the key.

“It would be very difficult to know how much legitimate activity was being censored.”

Fair use has come up time and time again during this DMCA debate and the Internet Archive is clearly very concerned that it receives protection. Worried that content filtering technology isn’t even up to today’s challenges, the Archive warns that systems that can identify instances of fair use simply don’t exist.

“So far, no computer algorithm has been developed that can determine whether a particular upload is fair use. Notice and Staydown would force many cases of legitimate fair use off the web,” it warns.

“Further, intermediaries are not the right party to be implementing this technology. They don’t have all the facts about the works, such as whether they have been licensed. Most platforms are not in a good position to be making legal judgments, and they are motivated to avoid the potential for high statutory damages. All this means that platforms are likely to filter out legitimate uses of content.”

Finally, there is the not insignificant matter of who is going to pay for all of these systems should platforms be forced to adopt them. While copyright holders would apparently reap the benefits, sites like the Internet Archive would probably be expected to foot the bill.

“Developing an accurate filter that will work for each and every platform on the web will be an extremely costly endeavor. Nonprofits, libraries, and educational institutions who act as internet service providers would be forced to spend a huge amount of their already scarce resources policing copyright,” the Archive warns.

“The DMCA has its problems, but Notice and Staydown would be an absolute disaster,” it concludes.


Axl Rose Sends DMCA Notices to Google Targeting ‘Fat’ Photo

Post Syndicated from Andy original https://torrentfreak.com/axl-rose-sends-dmca-notices-to-google-targeting-fat-photo-160605/

As regularly documented in these pages, copyright holders expend a lot of energy trying to protect their work from Internet piracy.

The tried and tested method is to issue a DMCA takedown notice to webhosts and platforms such as Google, Facebook and YouTube. Millions of these requests are sent and processed every week.

However, while copyright holders are fully entitled to protect their work, there are many instances that cause controversy. These cases often amount to ham-handed efforts at taking down infringing content but others arouse suspicions that censorship is the likely goal.

Details of several such cases appeared in the Lumen Database’s DMCA archive this week, having been filed there by Google. They all relate to a wave of copyright claims sent to Blogspot and GoogleUserContent on May 31, 2016 demanding the removal of pictures depicting Guns N’ Roses singer Axl Rose.

“Copyright image of Axl Rose. Please be advised that no permission has been granted to publish the copyright image so we cannot direct you to an authorized example of it,” the notices sent by Web Sheriff on behalf of the singer read.


Each notice (1,2,3,4,5,6) relates to the same image, an excellently framed but rather unflattering picture of Axl Rose taken at the MTS Centre, Winnipeg, Canada, back in 2010.


Intrigued, TorrentFreak tracked down the photographer who captured this moment to see if he was aware of these takedown efforts. We eventually found Boris Minkevich at the Winnipeg Free Press where his fine work is published in all its glory.

During our initial discussions a few things became clear. Firstly, Minkevich definitely took the photo. Second, Minkevich had no idea that Rose was trying to “cleanse the web” of his photo.

Perhaps the first reaction here is that Rose has no right to take down Minkevich’s photo. Since Minkevich was the one who took it, he must own the copyright, right? Web Sheriff doesn’t seem to think so.

“We can gladly confirm that all official / accredited photographers at [Axl Rose] shows sign-off on ‘Photography Permission’ contracts / ‘Photographic Release’ agreements which A. specify and limit the manner in which the photos can be exploited and B. transfer copyright ownership in such photos to AR’s relevant service company,” the company told TF in a statement.

We contacted Minkevich again and asked whether he’d signed any contracts as suggested by Web Sheriff or had any clear idea of who owns the copyrights. He confirmed that some shows make photographers sign an agreement and some don’t, but the event took place back in 2010, a long time ago to recall such details.

However, even if Minkevich took this photograph in an unofficial and/or unauthorized capacity, Web Sheriff still believes there would be issues surrounding ownership.

“[If a photographer] was there and taking shots without permission or authority, then other considerations / factors would come-into-play as to what such individuals can and cannot do in terms of attempting to commercially exploit the resultant images of someone else’s show,” TF was informed.

So while the waters about who owns what continue to swirl, the big question remains – why target the picture at all? Understandably, Web Sheriff told us that client work is confidential but it’s certainly possible that part of the puzzle lies a quick Google search away.

The photographs taken by Mr Minkevich all those years ago also triggered a viral Axl Rose ‘fat’ meme – hardly the kind of image someone like Axl Rose would want preserved.


While poking fun at someone’s appearance is sadly par for the course on some parts of the Internet, sending DMCA notices is hardly likely to cure the problem, if indeed that’s what the aim of the half-dozen notices was. It’s possible we’ll never find out for sure.

Finally, it’s worth pointing out that Google hasn’t complied with the requests to remove the images and all remain up and accessible. That may be because Google believes that Axl Rose doesn’t own the photo and that the copyrights sit with Minkevich and/or the Winnipeg Free Press.

Clearly Axl Rose thinks otherwise, but as Minkevich pointed out to TF, the images being targeted on Blogspot are definitely infringing, although perhaps not in the way Axl might’ve hoped.

“Either way the photo was stolen off our website with no permission granted by the Winnipeg Free Press,” he concludes.

Messy? You bet.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Fox ‘Stole’ a Game Clip, Used it in Family Guy & DMCA’d the Original

Post Syndicated from Andy original https://torrentfreak.com/fox-stole-a-game-clip-used-it-in-family-guy-dmcad-the-original-160520/

Just when you think you’ve seen every ridiculous example of a bogus DMCA-style takedown, another steps up to take the crown. This week’s abomination comes courtesy of Fox and it’s an absolute disaster.

In last Sunday’s episode of Family Guy titled “Run, Chris, Run”, Peter and Cleveland play the 1980s classic Nintendo video game Double Dribble. Peter doesn’t play fair though and exploits a glitch in the game that allows his player to sink a three-point shot every time. The clip is available on YouTube.

Perhaps surprisingly, the game glitch is absolutely genuine and was documented in a video that was uploaded to YouTube by a user called ‘sw1tched’ back in February 2009.

“This is an automatic shot my brothers and I found on the NES Double Dribble back in the 80’s when it was released. I know others know this also, but as long as you release at the right point it is automatic. The half court shot I took at the end goes in 80% of the time, but i didn’t want to keep recording….HAHA,” sw1tched wrote.

Interestingly, the clip that was uploaded by sw1tched is the exact same clip that appeared in the Family Guy episode on Sunday. So, unless Fox managed to duplicate the gameplay precisely, it must’ve taken the clip from YouTube.

Whether Fox can do that and legally show the clip in an episode is a matter for the experts to argue but what followed next was patently absurd. Shortly after the Family Guy episode aired, Fox filed a complaint with YouTube and took down the Double Dribble video game clip on copyright grounds. (mirror Daily Motion)


Faced with yet another example of a blatantly wrongful takedown, TorrentFreak spoke with Fight for the Future CTO Jeff Lyon. Coincidentally he’d just watched the episode in question.

“It’s most likely that this is just another example of YouTube’s Content ID system automatically taking down a video without regard to actual copyright ownership and fair use. As soon as FOX broadcast that Family Guy episode, their robots started taking down any footage that appeared to be reposted from the show — and in this case they took down the footage they stole from an independent creator,” Lyon says.

“The problem with an automated DMCA takedown system is that robots can never know the difference between fair use and copyright infringement. It is not hyperbolic to call this mass censorship,” he continues.

“Instead of copyright holders having to prove a video is infringing, their scanning software can take it down automatically, and then it falls on the creator to prove they had a right to post it. Creators are discouraged from filing counter-notices to stand up for their work, facing lost revenue and permanent bans from online platforms. This erodes fair use and free speech on the Internet.”

The entire situation is indeed bewildering and utterly ridiculous. The original Double Dribble game came out in 1987, some 12 years before the very first episode of Family Guy aired in 1999. The clip of the glitch was uploaded by sw1tched more than seven years ago. Then somehow Fox came along, copied it, put it into their TV show, claimed copyright on it, and then nuked the original clip from the Internet.

You couldn’t make it up. Nor would you want to.


Copyright Holders Dominate Closed-Door DMCA Hearings

Post Syndicated from Andy original https://torrentfreak.com/copyright-holders-dominate-closed-door-dmca-hearings-160518/

Earlier this year the U.S. Government ran a public consultation to evaluate the effectiveness of the DMCA’s Safe Harbor provisions. These include issues such as ‘notice and takedown’ plus shortcomings and abuses that arise from the current system.

In the final days of the consultation Fight for the Future (FFTF) and popular YouTube channel Channel Awesome launched a campaign urging the public to get involved. What followed was a massive response to the U.S. Copyright Office coordinated via the associated TakedownAbuse site. But that was just the beginning.

Thanks to the huge support FFTF and Channel Awesome (CA) were able to convince the U.S. Copyright Office (USCO) to give them seats at the table in a series of closed-door meetings on DMCA reforms held in San Francisco last week. Jeff Lyon (FFTF) and Mike Michaud (CA) attended and they report that discussion was heavily skewed in favor of copyright owners.

“Unfortunately, the hearings appeared to be rigged against the public interest, and unless we step up our game, it’s looking very likely that the USCO will make the DMCA even worse, with major giveaways to the copyright industry that put SOPA-style restrictions on independent content creators,” Lyon reports.

The FFTF CEO says that while Google, EFF and Mozilla were in attendance (pdf), they were outnumbered by pro-copyright groups including the MPAA, RIAA, Copyright Alliance (who previously labeled FFTF’s campaign participants as “zombies“), Creative Future, Disney, Paramount and NBCUniversal.

Speaking with TorrentFreak, Lyon says that one of the key copyright industry demands is for a “take down, stay down” system which would require platform owners to proactively police user-uploaded content.

“I can say for sure that there was overwhelming consensus in favor of ‘take down, stay down’ from members of the discussions affiliated with the copyright industry,” Lyon says.

“The idea is that once a copyright holder files a DMCA takedown for a particular piece of content, for example a music clip, it should then become the responsibility of the website operator to proactively scan everything uploaded by users and block that content from being posted in the future.”

Lyon says that this would effectively eliminate a user’s right to file a counter-notice, since they would be unable to post any content with a copyright claim against it, even in a fair use situation.

“Being unable to post copyrighted content also means users would be less able to sue copyright holders to assert a fair use right, since the content would be blocked by the website, instead of being taken down by a legal claim made by the copyright holder,” he explains.
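The “take down, stay down” mechanism Lyon describes can be sketched in a deliberately naive form. The `StaydownFilter` class below is hypothetical (not any real platform’s API), and real systems such as Content ID rely on perceptual fingerprints rather than the exact file hashes used here:

```python
import hashlib


class StaydownFilter:
    """Toy 'notice and stay down' filter: once content has been the
    subject of a takedown, block exact re-uploads of the same bytes."""

    def __init__(self):
        self._blocked_hashes = set()

    def register_takedown(self, data: bytes) -> None:
        # Record a fingerprint of the noticed content.
        self._blocked_hashes.add(hashlib.sha256(data).hexdigest())

    def allow_upload(self, data: bytes) -> bool:
        # Every new upload is scanned against all prior takedowns,
        # with no human review and no fair-use determination.
        return hashlib.sha256(data).hexdigest() not in self._blocked_hashes


f = StaydownFilter()
clip = b"\x00fake video bytes\x00"
f.register_takedown(clip)

assert not f.allow_upload(clip)        # exact re-upload stays down
assert f.allow_upload(clip + b"!")     # one changed byte slips through
```

The two checks illustrate the tension in the debate: an exact-match filter is trivially evaded by altering a single byte, while anything fuzzier blocks content automatically with no way to recognize a fair use.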

TorrentFreak asked Lyon if copyright holders had made any suggestions on how such a complex system could work in practice. Apparently some feel it is Google’s problem.

“Keith Kupferschmid of Copyright Alliance said something to the effect of ‘I’m not a techie, but if they can make self-driving cars, they can surely figure out how to keep copyrighted material from being posted’,” he said.

However, also in attendance was Tony Rodriguez of anti-piracy outfit Digimarc. Lyon says that Rodriguez suggested that his company has the ability to deal with the job.

“It was strongly implied that Digimarc’s scanning technology could be adapted for use by website owners to comply with staydown requirements. I think Digimarc is practically salivating at the prospect of being in control over a government-mandated copyright protection racket, where they can serve both copyright holders and website owners who are held hostage by new staydown rules,” he explained.

“Overall the attitude was that it should be the tech industry’s problem to figure out how to do it and pay for it. Nobody had a good answer for determining fair use scenarios programmatically.”

While physically outnumbered by copyright holders, Google senior copyright counsel Fred von Lohmann agreed with Jeff Lyon that content filtering technology is extremely expensive and burdensome for website owners to develop, noting that Google had spent over $40 million and deployed 100 software engineers to develop its Content ID system. Others weighed in too.

“One of the best points was made by Daphne Keller from Stanford Law,” Lyon says.

“She backed up my and von Lohmann’s claims that content scanning systems are generally expensive, but added that good content scanning algorithms that could protect fair use rights will be very expensive. Thus if sites are required to implement content scanning, they will be incentivized to use cheap options that would err on the side of filtering out lawful content and fair use.”

Interestingly, sitting right next to Lyon in one of the sessions was MPAA attorney Dean Marks. He appeared to have SOPA on his mind.

“After some light-hearted joking banter with the regulators, the MPAA attorney suggested new legislation to take down entire websites (aka SOPA) for suspected copyright infringement,” Lyon explains.

“He spoke briefly and near the end of the meeting, so it was really almost in passing. He did not get into specifics about overseas websites [per SOPA], only mentioned that torrent sites only exist to spread pirated material and should be taken down completely.”

Overall, Lyon says he gets the impression that rather like with the SOPA debate, these DMCA discussions are being framed as “copyright industry vs. tech industry”, something which undermines the public interest. Nevertheless, FFTF and groups including EFF are putting up a fight.

“The general public is more affected by the DMCA than they even know. Copyright holders are abusing the process to censor negative reviews and commentary from the Internet. Creators are discouraged from fighting back, facing lost revenue and permanent bans from online platforms,” Lyon says.

“The sheer number of copyright holders at the meetings allowed them to push the discussions toward their ‘take down, stay down’ agenda, and they repeatedly tried to discredit nearly 100,000 comments sent by the public calling them out for mass censorship and abuse of the existing DMCA takedown rules. The copyright industry is clearly engaging in a massive lobbying effort to bring new SOPA-style legislation back in front of Congress.”

FFTF acknowledge that their opponents are powerful lobbying forces but they believe they have the tools and the public backing to put up a fight.

“We’ve beat them before and we can do it again. The copyright industry was blindsided by nearly 100,000 comments sent to the Copyright Office in the span of one day. When the next round of public commenting opens up, we will be ready, and our voices will be impossible to ignore,” Lyon concludes.


Documenting the Chilling Effects of NSA Surveillance

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2016/04/documenting_the.html

In Data and Goliath, I talk about the self-censorship that comes along with broad surveillance. This interesting research documents this phenomenon in Wikipedia: “Chilling Effects: Online Surveillance and Wikipedia Use,” by Jon Penney, Berkeley Technology Law Journal, 2016.

Abstract: This article discusses the results of the first empirical study providing evidence of regulatory “chilling effects” of Wikipedia users associated with online government surveillance. The study explores how traffic to Wikipedia articles on topics that raise privacy concerns for Wikipedia users decreased after the widespread publicity about NSA/PRISM surveillance revelations in June 2013. Using an interdisciplinary research design, the study tests the hypothesis, based on chilling effects theory, that traffic to privacy-sensitive Wikipedia articles reduced after the mass surveillance revelations. The Article finds not only a statistically significant immediate decline in traffic for these Wikipedia articles after June 2013, but also a change in the overall secular trend in the view count traffic, suggesting not only immediate but also long-term chilling effects resulting from the NSA/PRISM online surveillance revelations. These, and other results from the case study, not only offer compelling evidence for chilling effects associated with online surveillance, but also offer important insights about how we should understand such chilling effects and their scope, including how they interact with other dramatic or significant events (like war and conflict) and their broader implications for privacy, U.S. constitutional litigation, and the health of democratic society. This study is among the first to demonstrate — using either Wikipedia data or web traffic data more generally — how government surveillance and similar actions impact online activities, including access to information and knowledge online.

Two news stories.

Pirate Bay’s Image Hosting Site ‘Bayimg’ Returns, For a Bit

Post Syndicated from Ernesto original http://feedproxy.google.com/~r/Torrentfreak/~3/zBjnSO0XcLw/

When a Pirate Bay server was raided in late 2014, several related projects were pulled offline as well, including the site’s image hosting service Bayimg and Pastebay.

While the torrent site itself eventually returned after two months, the other sites remained offline. However, a few days ago something changed.

Without an official announcement Bayimg resurfaced as if nothing ever happened. Suddenly, former users could access their images again and upload new files, although the latter may not be wise.

TorrentFreak reached out to the TPB team to find out what the plans are, and we were informed that the comeback is only temporary.

The site will remain online for a week or so. This allows people to secure their files, if needed, but in a few days the site will close its doors again. Apparently, the TPB team prefers to focus exclusively on the torrent site.

Bayimg


This means that the image hosting service won’t celebrate its tenth anniversary next year.

Bayimg was founded in 2007 as one of many TPB side-projects and promoted as a censorship free hosting platform. It was particularly popular among torrent uploaders, who used it to host screenshots.

However, history has shown that not all Pirate Bay projects are finished, and they certainly don’t always survive. Responding to this criticism, the Bayimg team listed a response in its FAQ, which still applies today.

“We do whatever we want, whenever we want. If it doesn’t suit you, you can start your own empire,” the team said back in 2007.


RuTracker and Sci-Hub Nominated for Free Knowledge Award

Post Syndicated from Ernesto original http://feedproxy.google.com/~r/Torrentfreak/~3/LSiWbZuVdgg/

For the third year in a row the official Russian Wikimedia chapter is awarding a prize to a person or organization that made a notable contribution in line with the goals of the Wikimedia movement.

Earlier this month Wikimedia announced the nominations for the Wiki “Free Knowledge” award, which includes Russia’s largest torrent tracker RuTracker as well as Sci-Hub founder Alexandra Elbakyan.

Elbakyan made headlines around the world after she was sued by Elsevier, one of the largest academic publishers. Through Sci-Hub she offers millions of academic articles, which are usually behind a paywall, free of charge.

“Everyone should have access to knowledge regardless of their income or affiliation. And that’s absolutely legal. The idea that knowledge can be a private property of some commercial company sounds absolutely weird to me,” she told us last year.

Sci-Hub


This defiant stance is supported by many scientists who are calling for more open access to research findings, and it also earned her a nomination for Wikimedia Russia’s Free Knowledge award.

“For many Russian scientists this project is in fact the only opportunity to quickly familiarize themselves with scientific articles, especially given the economic events of the last couple of years,” one commenter noted during the nomination process.

In total there are nine nominees, including a Russian State Library project, the Russian Ministry of Defense and the hugely popular torrent site RuTracker.

In recent months RuTracker has pushed back hard against legal pressure from various sides and various censorship efforts. According to some Wikimedia members, the site deserves to be awarded for its role in freely spreading Russian culture.

“I know hundreds of writers who through RuTracker distribute their own works: musicians, directors, writers, scientists, teachers, photographers and others,” a commenter noted during the nomination process, applauding the site’s free knowledge approach.

Not everyone agrees with the nomination of RuTracker though. Another member highlighted the numerous copyright violations which run contrary to the ideas of the Wikimedia Foundation, calling the nomination “unacceptable and absurd.”

The members of the Russian Wikimedia chapter will now weigh the pros and cons for each of the nominees. The winner will be announced next month during the award ceremony.


Opera Browser Adds Free and Unlimited VPN

Post Syndicated from Ernesto original http://feedproxy.google.com/~r/Torrentfreak/~3/IemBItu6G-c/

Back in 2006 Opera was the first major browser to include BitTorrent support, and today it releases another feature that will appeal to millions of users.

The company has added a free and unlimited VPN to the developer version of its browser. This means that users can browse the web securely at the flick of a switch.

Privacy aside, the built-in VPN is also an ideal tool to circumvent website blockades. This may come in handy for the aforementioned BitTorrent users as well, as sites such as The Pirate Bay are blocked in many countries.

The VPN connection is provided by the Canadian VPN service SurfEasy, which like many other VPNs keeps no logs. SurfEasy was acquired by Opera last year and COO Steve Kelly tells TorrentFreak that privacy and censorship were the main reasons to add the free VPN to Opera.

“Everyone deserves to surf privately online if they want to. Today, it is too difficult to maintain privacy when using the web, and way too many people experience roadblocks online, like blocked content,” Kelly says.

“By releasing an integrated, free and unlimited VPN in the browser, we make it simple for people to enhance their privacy and access the content they want,” he adds.

It is worth highlighting that the VPN connection is limited to the web browser. This means that any content shared outside the browser, through traditional torrent clients for example, is not private.

Opera’s in-browser VPN uses AES-256 encryption and SurfEasy says that the initial response has been very strong. The network is prepared to handle hundreds of thousands of simultaneous connections without any problems.

With the addition of a VPN feature Opera hopes to set a new standard for modern browsers. Earlier, it was already the first major browser to include an ad-blocker.

“This is the first VPN option integrated into a major browser. Also, it’s delivered from a company you can trust, with an extensive history of providing reliable and trustworthy internet products,” Kelly told us.

More details about the built-in VPN are available at the Opera blog. People who want to give it a spin should download the latest developer release, as the feature is not available in the regular version yet.

Opera’s VPN feature


Copyright Group Likens Massive DMCA Abuse Protests to “Zombie Apocalypse”

Post Syndicated from Ernesto original http://feedproxy.google.com/~r/Torrentfreak/~3/FoLGKp5vlwE/

In recent years there have been a lot of complaints about the current state of the DMCA takedown process.

To hear the growing concerns from all sides, the U.S. Copyright Office launched a public consultation in order to evaluate the impact and effectiveness of the 1998 copyright law.

Just before the deadline expired last week, Fight for the Future (FFTF) and popular YouTube channel Channel Awesome decided to join in. They launched a campaign through which people could protest DMCA abuse, triggering over 90,000 responses in less than 24 hours.

The public interest was so overwhelming that the Government’s servers reportedly “crashed” under the heavy load.

The protest organizers were delighted to see that so many people had voiced their concerns. Up until they got involved there had only been a few dozen responses, so their efforts made a huge impact.

However, copyright holders and industry groups are not pleased with the public outcry. Earlier this week Keith Kupferschmid, CEO of the Hollywood funded Copyright Alliance, likened it to a “Copyright Zombie Apocalypse.”

“Well, in case you were unconscious and left for dead in a hospital last week, the copyright community experienced its own zombie apocalypse,” Kupferschmid writes.

His main complaint is that nearly all comments were sent through the TakedownAbuse campaign site, where people could send in the pre-filled form highlighting various abuse related problems.

“These 90,000 comments are all identical submissions generated merely by clicking on the ‘I’m in’ button at takedownabuse.org. Like the zombies in The Walking Dead, there was not a lot of effort or brainpower that went into the 90,000 plus submissions,” he notes.

“If there are problems with the DMCA the best way to understand what those problems are, and to attempt to address them, is for those with concerns to voice them in detail and not file yet another zombie comment. As we’ve learned from The Walking Dead, those zombies are rather easily disposed of.”

While Kupferschmid certainly has a point when he argues that the massive number of responses is unlikely to generate a broad range of insights, the harsh wording appears to be a sign of bitter frustration.

Knowing that tens of thousands of people share a certain point of view has value, and the Copyright Office is clever enough to take the context into account.

Interestingly, however, Kupferschmid notes that he would say the same if the comments were voicing pro-copyright sentiments.

This is rather ironic because the Copyright Alliance is actively promoting several pro-copyright campaigns that also allow the public to sign pre-written petitions. Unlike the form at TakedownAbuse.org, people can’t even edit the message. Like “zombies,” all they are encouraged to do is sign.

TorrentFreak spoke to FFTF’s Tiffiniy Cheng, who notes that people did edit or add their own comments. In any case, equating tens of thousands of concerned citizens to zombies might not be the best move.

“The expression of a disagreement with a certain policy is valuable to our democracy and debate. And, that’s what we have here,” Cheng says.

“The people who filed comments have experienced real censorship that they want to stop and care deeply about stopping DMCA takedown abuse. You can’t discount that, they are getting organized and demanding a seat at the table the best way they know how – by coming together and showing how big this problem is,” she adds.

After the comment deadline passed the Takedownabuse campaign received thousands of additional comments. They plan to submit these additional responses to the Copyright Office as a petition.

Perhaps the Copyright Alliance should join in, rally some “zombies,” and launch a petition of their own?


Music Industry: DMCA Copyright Law is Obsolete and Harmful

Post Syndicated from Ernesto original http://feedproxy.google.com/~r/Torrentfreak/~3/JmOuNPvPdmI/

Signed into law by President Bill Clinton in 1998, the Digital Millennium Copyright Act (DMCA) aimed to ready copyright law for the digital age.

The law introduced a safe harbor for Internet services, meaning that they can’t be held liable for their pirating users as long as they properly process takedown notices and deal with repeat infringers.

However, in recent years copyright holders, Internet services and the public in general have signaled various shortcomings. Rightsholders believe that the law doesn’t do enough to protect creators, while the opposing side warns of increased censorship and abuse.

To hear the growing concerns from all sides, the U.S. Copyright Office launched a public consultation in order to evaluate the impact and effectiveness of the DMCA’s safe harbor provisions.

A few hours ago a broad coalition of 400 artists and music groups, including the RIAA, Music Publishers Association and A2IM, submitted their response. The 70-page brief provides a comprehensive overview of what the music industry sees as the DMCA’s shortcomings while calling for significant reform.

“The Music Community’s list of frustrations with the DMCA is long,” the groups write, adding that “a law that might have made sense in 1998 is now not only obsolete but actually harmful.”

The music industry’s comments focus heavily on search engines, Google in particular. In recent years music companies have sent hundreds of millions of takedown notices to Google, but despite these efforts, copyright infringing material still tops many search results.

“The notice-and-takedown system has proved an ineffective tool for the volume of unauthorized digital music available, something akin to bailing out an ocean with a teaspoon,” they write.

“Copyright owners should not be required to engage in the endless game of sending repeat takedown notices to protect their works, simply because another or the same infringement of the initially noticed work appears at a marginally different URL than the first time.”

The music groups are calling for advanced technologies and processes to ensure that infringing content doesn’t reappear elsewhere once it’s removed, a so-called “notice and stay down” approach.

This includes audio fingerprinting technologies, hash-matching technologies, metadata correlations and the removal of links that point to content which has already been taken down.

“The current standard of ‘URL by URL’ takedown does not make sense in a world where there is an infinite supply of URLs,” the groups add.

Another problem with the DMCA, according to the music companies, is that the safe harbor provision also protects sites that are clearly profiting from copyright infringement.

Describing it as a “get out of jail free” card for many dubious sites, the RIAA and the others demand change.

“At its worst, the DMCA safe harbors have become a business plan for profiting off of stolen content; at best, the system is a de facto government subsidy enriching some digital services at the expense of creators. This almost 20 year-old, 20th Century law should be updated,” they write.

The music industry groups note that these and other issues have turned the DMCA into a “dysfunctional relic,” and they are calling on Congress to take action and come up with a copyright law that better protects their interests.

The anti-DMCA comments submitted to the U.S. Government are the strongest we’ve seen thus far, but more responses are expected to be published after the deadline passes today.

Where most copyright holders call for stricter anti-piracy measures, many Internet services and activists are expected to focus on the increase in DMCA abuse and censorship.

Earlier this week a Google-funded report revealed that close to 30% of all DMCA requests Google receives are “questionable” and the EFF previously called on the public to share their DMCA horror stories.

In addition, Fight for the Future just launched a campaign page, helping the public to inform the Copyright Office that DMCA abuses should be stopped. The campaign generated over 50,000 comments in a day, ‘crashing’ the Government’s website.


What is hacker culture?

Post Syndicated from Matthew Garrett original http://mjg59.dreamwidth.org/38746.html

Eric Raymond, author of The Cathedral and the Bazaar (an important work describing the effectiveness of open collaboration and development), recently wrote a piece calling for “Social Justice Warriors” to be ejected from the hacker community. The primary thrust of his argument is that by calling for a removal of the “cult of meritocracy”, these SJWs are attacking the central aspect of hacker culture – that the quality of code is all that matters.

This argument is simply wrong.

Eric’s been involved in software development for a long time. In that time he’s seen a number of significant changes. We’ve gone from computers being the playthings of the privileged few to being nearly ubiquitous. We’ve moved from the internet being something you found in universities to something you carry around in your pocket. You can now own a computer whose CPU executes only free software from the moment you press the power button. And, as Eric wrote almost 20 years ago, we’ve identified that the “Bazaar” model of open collaborative development works better than the “Cathedral” model of closed centralised development.

These are huge shifts in how computers are used, how available they are, how important they are in people’s lives, and, as a consequence, how we develop software. It’s not a surprise that the rise of Linux and the victory of the bazaar model coincided with internet access becoming more widely available. As the potential pool of developers grew larger, development methods had to be altered. It was no longer possible to insist that somebody spend a significant period of time winning the trust of the core developers before being permitted to give feedback on code. Communities had to change in order to accept these offers of work, and the communities were better for that change.

The increasing ubiquity of computing has had another outcome. People are much more aware of the role of computing in their lives.
They are more likely to understand how proprietary software can restrict them, how not having the freedom to share software can impair people’s lives, how not being able to involve themselves in software development means software doesn’t meet their needs. The largest triumph of free software has not been amongst people from a traditional software development background – it’s been the fact that we’ve grown our communities to include people from a huge number of different walks of life. Free software has helped bring computing to under-served populations all over the world. It’s aided circumvention of censorship. It’s inspired people who would never have considered software development as something they could be involved in to develop entire careers in the field. We will not win because we are better developers. We will win because our software meets the needs of many more people, needs the proprietary software industry either can not or will not satisfy. We will win because our software is shaped not only by people who have a university degree and a six figure salary in San Francisco, but because our contributors include people whose native language is spoken by so few people that proprietary operating system vendors won’t support it, people who live in a heavily censored regime and rely on free software for free communication, people who rely on free software because they can’t otherwise afford the tools they would need to participate in development.

In other words, we will win because free software is accessible to more of society than proprietary software. And for that to be true, it must be possible for our communities to be accessible to anybody who can contribute, regardless of their background.

Up until this point, I don’t think I’ve made any controversial claims. In fact, I suspect that Eric would agree. He would argue that because hacker culture defines itself through the quality of contributions, the background of the contributor is irrelevant.
On the internet, nobody knows that you’re contributing from a basement in an active warzone, or from a refuge shelter after escaping an abusive relationship, or with the aid of assistive technology. If you can write the code, you can participate.

Of course, this kind of viewpoint is overly naive. Humans are wonderful at noticing indications of “otherness”. Eric even wrote about his struggle to stop having a viscerally negative reaction to people of a particular race. This happened within the past few years, so before then we can assume that he was less aware of the issue. If Eric received a patch from someone whose name indicated membership of this group, would there have been part of his subconscious that reacted negatively? Would he have rationalised this into a more critical analysis of the patch, increasing the probability of rejection? We don’t know, and it’s unlikely that Eric does either.

Hacker culture has long been concerned with good design, and a core concept of good design is that code should fail safe – i.e., if something unexpected happens or an assumption turns out to be untrue, the desirable outcome is the one that does least harm. A command that fails to receive a filename as an argument shouldn’t assume that it should modify all files. A network transfer that fails a checksum shouldn’t be permitted to overwrite the existing data. An authentication server that receives an unexpected error shouldn’t default to granting access. And a development process that may be subject to unconscious bias should have processes in place that make it less likely that said bias will result in the rejection of useful contributions.

When people criticise meritocracy, they’re not criticising the concept of treating contributions based on their merit. They’re criticising the idea that humans are sufficiently self-aware that they will be able to identify and reject every subconscious prejudice that will affect their treatment of others.
It’s not a criticism of a desirable goal, it’s a criticism of a flawed implementation. There’s evidence that organisations that claim to embody meritocratic principles are more likely to reward men than women even when everything else is equal. The “cult of meritocracy” isn’t the belief that meritocracy is a good thing, it’s the belief that a project founded on meritocracy will automatically be free of bias.

Projects like the Contributor Covenant that Eric finds so objectionable exist to help create processes that (at least partially) compensate for our flaws. Review of our processes to determine whether we’re making poor social decisions is just as important as review of our code to determine whether we’re making poor technical decisions. Just as the bazaar overtook the cathedral by making it easier for developers to be involved, inclusive communities will overtake “pure meritocracies” because, in the long run, these communities will produce better output – not just in terms of the quality of the code, but also in terms of the ability of the project to meet the needs of a wider range of people.

The fight between the cathedral and the bazaar came from people who were outside the cathedral. Those fighting against the assumption that meritocracies work may be outside what Eric considers to be hacker culture, but they’re already part of our communities, already making contributions to our projects, already bringing free software to more people than ever before. This time it’s Eric building a cathedral and decrying the decadent hordes in their bazaar, Eric who’s failed to notice the shift in the culture that surrounds him. And, like those who continued building their cathedrals in the 90s, it’s Eric who’s now irrelevant to hacker culture.

(Edited to add: for two quite different perspectives on why Eric’s wrong, see Tim’s and Coraline’s posts)

The CA’s Role in Fighting Phishing and Malware

Post Syndicated from Let's Encrypt - Free SSL/TLS Certificates original https://letsencrypt.org//2015/10/29/phishing-and-malware.html

Since we announced Let’s Encrypt we’ve often been asked how we’ll ensure that we don’t issue certificates for phishing and malware sites. The concern most commonly expressed is that having valid HTTPS certificates helps these sites look more legitimate, making people more likely to trust them.

Deciding what to do here has been tough. On the one hand, we don’t like these sites any more than anyone else does, and our mission is to help build a safer and more secure Web. On the other hand, we’re not sure that certificate issuance (at least for Domain Validation) is the right level on which to be policing phishing and malware sites in 2015. This post explains our thinking in order to encourage a conversation about the CA ecosystem’s role in fighting these malicious sites.

CAs Make Poor Content Watchdogs

Let’s Encrypt is going to be issuing Domain Validation (DV) certificates. On a technical level, a DV certificate asserts that a public key belongs to a domain – it says nothing else about a site’s content or who runs it. DV certificates do not include any information about a website’s reputation, real-world identity, or safety. However, many people believe the mere presence of a DV certificate ought to connote at least some of these things.
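To make concrete just how little a DV certificate asserts, here is roughly what the request side looks like at the OpenSSL level (the domain and file names are placeholders, not anything specific to Let’s Encrypt):

```shell
# Generate a key pair and a certificate request whose subject contains
# nothing but a domain name -- the only thing DV issuance validates.
openssl req -new -newkey rsa:2048 -nodes -keyout example.key \
  -subj "/CN=example.com" -out example.csr

# Inspect the request: no organisation, no real-world identity, just the domain.
openssl req -in example.csr -noout -subject
```

The subject line printed by the second command carries only the domain, which is why a DV certificate can say nothing about a site’s content or operator.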

Treating a DV certificate as a kind of “seal of approval” for a site’s content is problematic for several reasons.

First, CAs are not well positioned to operate anti-phishing and anti-malware operations – or to police content more generally. They simply do not have sufficient ongoing visibility into sites’ content. The best CAs can do is check with organizations that have much greater content awareness, such as Microsoft and Google. Google and Microsoft consume vast quantities of data about the Web from massive crawling and reporting infrastructures. This data allows them to use complex machine learning algorithms (developed and operated by dozens of staff) to identify malicious sites and content.

Even if a CA checks for phishing and malware status with a good API, the CA’s ability to accurately express information regarding phishing and malware is extremely limited. Site content can change much faster than certificate issuance and revocation cycles, phishing and malware status can be page-specific, and certificates and their related browser UIs contain little, if any, information about phishing or malware status. When a CA doesn’t issue a certificate for a site with phishing or malware content, users simply don’t see a lock icon. Users are much better informed and protected when browsers include anti-phishing and anti-malware features, which typically do not suffer from any of these limitations.

Another issue with treating DV certificates as a “seal of approval” for site content is that there is no standard for CA anti-phishing and anti-malware measures beyond a simple blacklist of high-value domains, so enforcement is inconsistent across the thousands of CAs trusted by major browsers. Even if one CA takes extraordinary measures to weed out bad sites, attackers can simply shop around to different CAs. The bad guys will almost always be able to get a certificate and hold onto it long enough to exploit people. It doesn’t matter how sophisticated the best CA anti-phishing and anti-malware programs are; it only matters how weak the worst are. It’s a “find the weakest link” scenario, and weak links aren’t hard to find.

Browser makers have realized all of this. That’s why they are pushing phishing and malware protection features, and evolving their UIs to more accurately reflect the assertions that certificates actually make.

TLS No Longer Optional

When they were first developed in the 1990s, HTTPS and SSL/TLS were considered “special” protections that were only necessary or useful for particular kinds of websites, like online banks and shopping sites accepting credit cards. We’ve since come to realize that HTTPS is important for almost all websites. It’s important for any website that allows people to log in with a password, any website that tracks its users in any way, any website that doesn’t want its content altered, and any site that offers content people might not want others to know they are consuming. We’ve also learned that any site not secured by HTTPS can be used to attack other sites.

TLS is no longer the exception, nor should it be. That’s why we built Let’s Encrypt. We want TLS to be the default method for communication on the Web. It should just be a fundamental part of the fabric, like TCP or HTTP. When this happens, having a certificate will become an existential issue, rather than a value add, and content policing mistakes will be particularly costly. On a technical level, mistakes will lead to significant downtime, due to slow issuance and revocation cycles and to features like HSTS. On a philosophical and moral level, mistakes (innocent or otherwise) will mean censorship, since CAs would be gatekeepers for online speech and presence. This is probably not a good role for CAs.
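The HSTS point is worth spelling out. A single response header like the following (an nginx sketch; the one-year max-age is just a common choice) commits browsers to HTTPS-only access, so a wrongly refused or revoked certificate means hard downtime rather than a fallback to plain HTTP:

```nginx
# Once a browser has cached this header, it will refuse to load the site
# over plain HTTP for the next year. If certificate issuance is then
# denied by mistake, the site is simply unreachable -- not "less secure".
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
```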

Our Plan

At least for the time being, Let’s Encrypt is going to check with the Google Safe Browsing API before issuing certificates, and refuse to issue to sites that are flagged as phishing or malware sites. Google’s API is the best source of phishing and malware status information that we have access to, and attempting to do more than query this API before issuance would almost certainly be wasteful and ineffective.
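The post doesn’t describe the mechanics of the check, but a pre-issuance lookup along these lines could be sketched against Google’s Safe Browsing Lookup API (v4). The endpoint, field names, and client identifiers below follow Google’s public documentation and are illustrative, not Let’s Encrypt’s actual implementation:

```python
import json
from urllib import request

SAFE_BROWSING_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_payload(domain):
    """Build a Safe Browsing v4 lookup body for a domain we are about to issue for."""
    return {
        "client": {"clientId": "example-ca", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": f"http://{domain}/"}],
        },
    }

def is_flagged(domain, api_key):
    """Return True if Safe Browsing reports any phishing/malware match for the domain.

    A CA using a check like this would refuse issuance on a match and
    proceed on an empty response.
    """
    url = f"{SAFE_BROWSING_ENDPOINT}?key={api_key}"
    body = json.dumps(build_lookup_payload(domain)).encode()
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        result = json.load(resp)
    return bool(result.get("matches"))
```

As the post notes, this is about the limit of what a pre-issuance check can usefully do: the API is queried once at issuance time, while the site’s content can change the moment the certificate is in hand.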

We’re going to implement this phishing and malware status check because many people are not comfortable with CAs entirely abandoning anti-phishing and anti-malware efforts just yet, even for DV certificates. We’d like to continue the conversation for a bit longer before we abandon what many people perceive to be an important CA behavior, even though we disagree.

Conclusion

The fight against phishing and malware content is an important one, but it does not make sense for CAs to be on the front lines, at least when it comes to DV certificates. That said, we’re going to implement checks against the Google Safe Browsing API while we continue the conversation.

We look forward to hearing what you think. Please let us know.