Tag Archives: Africa

Dish Network Files Two Lawsuits Against Pirate IPTV Providers

Post Syndicated from Andy original https://torrentfreak.com/dish-network-files-two-lawsuits-against-pirate-iptv-providers-180103/

In broad terms, there are two types of unauthorized online streaming of live TV. The first is via open-access websites where users can view for free. The second features premium services to which viewers are required to subscribe.

Usually available for a few dollars, euros, or pounds per month, the latter are gaining traction all around the world. Service levels are relatively high and the majority of illicit packages offer a dazzling array of programming, often putting official providers in the shade.

For this reason, commercial IPTV providers are considered a huge threat to broadcasters’ business models, since they offer a broadly comparable and accessible service at a much cheaper price. This is driving companies such as US giant Dish Network to court in search of relief.

Following on from a lawsuit filed last year against Kodi add-on ZemTV and TVAddons.ag, Dish has just filed two more suits, this time targeting a pair of pirate IPTV services.

Filed in Maryland and Texas respectively, the actions are broadly similar, with the former targeting a provider known as Spider-TV.

The suit, filed against Dima Furniture Inc. and Mohammad Yusif (individually and collectively doing business as Spider-TV), claims that the defendants are “capturing broadcasts of television channels exclusively licensed to DISH and are unlawfully retransmitting these channels over the Internet to their customers throughout the United States, 24 hours per day, 7 days per week.”

Dish claims that the defendants profit from the scheme by selling set-top boxes along with subscriptions, charging around $199 per device loaded with 13 months of service.

Dima Furniture is a Maryland corporation, registered at Takoma Park, Maryland 20912, an address that is listed on the Spider-TV website. The connection between the defendants is further supported by FCC references which identify Spider devices in the market. Mohammad Yusif is claimed to be the president, executive director, general manager, and sole shareholder of Dima Furniture.

Dish describes itself as the fourth largest pay-television provider in the United States, delivering copyrighted programming to millions of subscribers nationwide by means of satellite delivery and over-the-top services. Dish has acquired the rights to do this, the defendants have not, the broadcaster states.

“Defendants capture live broadcast signals of the Protected Channels, transcode these signals into a format useful for streaming over the Internet, transfer the transcoded content to one or more servers provided, controlled, and maintained by Defendants, and then transmit the Protected Channels to users of the Service through OTT delivery, including users in the United States,” the lawsuit reads.

It’s claimed that in July 2015, Yusif registered Spider-TV as a trade name of Dima Furniture with the Department of Assessments and Taxation Charter Division, describing the business as “Television Channel Installation”. Since then, the defendants have been illegally retransmitting Dish channels to customers in the United States.

The overall offer from Spider-TV appears to be considerable, with a claimed 1,300 channels from major regions including the US, Canada, UK, Europe, Middle East, and Africa.

Importantly, Dish states that the defendants know their activities are illegal: the broadcaster has sent at least 32 infringement notices since January 20, 2017 demanding an end to the unauthorized retransmission of its channels. It went on to send even more to the defendants’ ISPs.

“DISH and Networks sent at least thirty-three additional notices requesting the removal of infringing content to Internet service providers associated with the Service from February 16, 2017 to the filing of this Complaint. Upon information and belief, at least some of these notices were forwarded to Defendants,” the lawsuit reads.

But while Dish says the ISPs initially responded to the takedowns successfully, the defendants took evasive action, transmitting the targeted channels from other locations.

Describing the defendants’ actions as “willful, malicious, intentional [and] purposeful”, Dish is suing for Direct Copyright Infringement, demanding a permanent injunction preventing the promotion and provision of the service plus statutory damages of $150,000 per registered work. The final amount isn’t specified but the numbers are potentially enormous. In addition, Dish demands attorneys’ fees, costs, and the seizure of all infringing articles.

The second lawsuit, filed in Texas, is broadly similar. It targets Mo’ Ayad Al Zayed Trading Est., and Mo’ Ayad Fawzi Al Zayed (individually and collectively doing business as Tiger International Company), and Shenzhen Tiger Star Electronical Co., Ltd, otherwise known as Shenzhen Tiger Star.

Dish claims that these defendants also illegally capture and retransmit channels to customers in the United States. Delivery is via IPTV boxes costing up to $179, including one year’s service.

In common with the Maryland case, Dish says it sent almost two dozen takedown notices to ISPs utilized by the defendants. These were also countered by the unauthorized service retransmitting Dish channels from other servers.

The biggest difference between the Maryland and Texas cases is that while Yusif/Spider/Dima Furniture are said to be in the US, Zayed is said to reside in Amman, Jordan, and Tiger Star is registered in Shenzhen, China. However, since the unauthorized service is targeted at customers in Texas, Dish states that the Texas court has jurisdiction.

Again, Dish is suing for Direct Infringement, demanding damages, costs, and a permanent injunction.

The complaints can be found here and here.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

AWS Direct Connect Update – Ten New Locations Added in Late 2017

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-direct-connect-update-ten-new-locations-added-in-late-2017/

Happy 2018! I am looking forward to getting back to my usual routine, working with our teams to learn about their upcoming launches and then writing blog posts to bring the news to you. Right now I am still catching up on a few launches and announcements from late 2017.

First on the list for today is our most recent round of new cities for AWS Direct Connect. AWS customers all over the world use Direct Connect to create dedicated network connections from their premises to AWS in order to reduce their network costs, increase throughput, and pursue a more consistent network experience.

We added ten new locations to our Direct Connect roster in December, all of which offer both 1 Gbps and 10 Gbps connectivity, along with partner-supplied options for speeds below 1 Gbps. Here are the newest locations, along with the data centers and associated AWS Regions:

  • Bangalore, India – NetMagic DC2 – Asia Pacific (Mumbai)
  • Cape Town, South Africa – Teraco Ct1 – EU (Ireland)
  • Johannesburg, South Africa – Teraco JB1 – EU (Ireland)
  • London, UK – Telehouse North Two – EU (London)
  • Miami, Florida, US – Equinix MI1 – US East (Northern Virginia)
  • Minneapolis, Minnesota, US – Cologix MIN3 – US East (Ohio)
  • Ningxia, China – Shapotou IDC – China (Ningxia)
  • Ningxia, China – Industrial Park IDC – China (Ningxia)
  • Rio de Janeiro, Brazil – Equinix RJ2 – South America (São Paulo)
  • Tokyo, Japan – AT Tokyo Chuo – Asia Pacific (Tokyo)

You can use these new locations in conjunction with the AWS Direct Connect Gateway to set up connectivity that spans Virtual Private Clouds (VPCs) spread across multiple AWS Regions (this does not apply to the AWS Regions in China).
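
To make the workflow concrete, here is a minimal sketch of the two boto3 calls involved in creating a Direct Connect gateway and associating it with a VPC’s virtual private gateway. The gateway name, ASN, and VGW ID are placeholders invented for this example, not values from any real deployment.

```python
import boto3

# Direct Connect gateways are global objects that link a Direct Connect
# connection to virtual private gateways in multiple AWS Regions.
dx = boto3.client("directconnect", region_name="us-east-1")

# Create the gateway; the name and Amazon-side ASN are illustrative.
gateway = dx.create_direct_connect_gateway(
    directConnectGatewayName="example-dx-gateway",
    amazonSideAsn=64512,
)
gateway_id = gateway["directConnectGateway"]["directConnectGatewayId"]

# Associate an existing virtual private gateway (already attached to a VPC
# in any supported Region) with the new gateway. The VGW ID is a placeholder.
dx.create_direct_connect_gateway_association(
    directConnectGatewayId=gateway_id,
    virtualGatewayId="vgw-0123456789abcdef0",
)
```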

If you are interested in putting Direct Connect to use, be sure to check out our ever-growing list of Direct Connect Partners.

Jeff;

Journeying with green sea turtles and the Arribada Initiative

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/sea-turtles/

Today, a guest post: Alasdair Davies, co-founder of Naturebytes, ZSL London’s Conservation Technology Specialist and Shuttleworth Foundation Fellow, shares the work of the Arribada Initiative. The project uses the Raspberry Pi Zero and camera module to follow the journey of green sea turtles. The footage captured from the backs of these magnificent creatures is just incredible – prepare to be blown away!

Pit Stop Camera on Green Sea Turtle 01

Footage from the new Arribada PS-C (pit-stop camera) video tag, recently trialled on the island of Principe in partnership with the Principe Trust. Engineered by Institute IRNAS (http://irnas.eu/) for the Arribada Initiative (http://blog.arribada.org/).

Access to affordable, open and customisable conservation technologies in the animal tracking world is often limited. I’ve been a conservation technologist for the past ten years, co-founding Naturebytes and working at ZSL London Zoo, and this was a problem that continued to frustrate me. It was inherently expensive to collect valuable data that was necessary to inform policy, to designate marine protected areas, or to identify threats to species.

In March this year, I got a supercharged opportunity to break through these barriers by becoming a Shuttleworth Foundation Fellow, meaning I had the time and resources to concentrate on cracking the problem. The Arribada Initiative was founded, and ten months later, the open source Arribada PS-C green sea turtle tag was born. The video above was captured two weeks ago in the waters of Principe Island, West Africa.

Alasdair Davies on Twitter

On route to Principe island with 10 second gen green sea #turtle tags for testing. This version has a video & accelerometer payload for behavioural studies, plus a nice wireless charging carry case made by @institute_irnas @ShuttleworthFdn

The tag comprises a Raspberry Pi Zero W sporting the Raspberry Pi camera module, a PiRA power management board, two lithium-ion cells, and a rather nice enclosure. It was built in partnership with Institute IRNAS, and there’s a user-friendly wireless charging case to make it easy for the marine guards to replace the tags after their voyages at sea. When a tag is returned to one of the docking stations in the case, we use resin.io to manage it, download videos, and configure the tag remotely.

Green Sea Turtle Alasdair Davies Raspberry Pi

The tags can also be configured to take video clips at timed intervals, meaning we can now observe marine litter and plastic debris, before-and-after changes to the ocean environment due to nearby construction, pollution, and other threats.
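
For a sense of how little code timed clips take on a Pi Zero, here is a minimal loop using the picamera library. This is an illustrative sketch, not the Arribada tag’s actual firmware; the resolution, clip length, and interval are invented values.

```python
import time
from picamera import PiCamera

camera = PiCamera(resolution=(1280, 720))

CLIP_SECONDS = 20       # length of each clip (illustrative)
INTERVAL_SECONDS = 600  # time between clip starts (illustrative)

for clip_number in range(12):
    camera.start_recording("clip-{:03d}.h264".format(clip_number))
    camera.wait_recording(CLIP_SECONDS)  # record while checking for errors
    camera.stop_recording()
    time.sleep(INTERVAL_SECONDS - CLIP_SECONDS)
```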

Discarded fishing nets are lethal to sea turtles, so using this new tag at scale – now finally possible, as the Raspberry Pi Zero helps to drive down costs dramatically whilst retaining excellent video quality – offers real value to scientists in the field. Next year we will be releasing an optimised, affordable GPS version.

green sea turtle Alasdair Davies Raspberry Pi Arribada Initiative

To make this all possible we had to devise a quicker method of attaching the tag to the sea turtles too, so we came up with the “pit-stop” technique (which is what the PS in the name “Arribada PS-C” stands for). Just as a Formula 1 car would visit the pits to get its tyres changed, we literally switch out the tags on the beach when nesting females return, replacing them with freshly charged tags by using a quick-release base plate.

Alasdair Davies on Twitter

About 6 days left now until the first tagged nesting green sea #turtles return using our latest “pit-stop” removeable / replaceable tag method. Counting down the days @arribada_i @institute_irnas

To implement the system we first epoxy the base plate to the turtle, which minimises any possible stress to the turtles as the method is quick. Once the epoxy has dried we attach the tag. When the turtle has completed its nesting cycle (they visit the beach to lay eggs three to four times in a single season, every 10–14 days on average), we simply remove the base plate to complete the field work.

Green Sea Turtle Alasdair Davies Raspberry Pi

If you’d like to watch more wonderful videos of the green sea turtles’ adventures, there’s an entire YouTube playlist available here. And to keep up to date with the initiative, be sure to follow Arribada and Alasdair on Twitter.

The post Journeying with green sea turtles and the Arribada Initiative appeared first on Raspberry Pi.

CoderDojo: 2000 Dojos ever

Post Syndicated from Giustina Mizzoni original https://www.raspberrypi.org/blog/2000-dojos-ever/

Every day of the week, we verify new Dojos all around the world, and each Dojo is championed by passionate volunteers. Last week, a huge milestone for the CoderDojo community went by relatively unnoticed: in the history of the movement, more than 2000 Dojos have now been verified!

CoderDojo banner — 2000 Dojos

2000 Dojos

This is a phenomenal achievement for a movement that’s just six years old and powered by volunteers. Presently, there are more than 1650 active Dojos running weekly, fortnightly, or monthly, and all of them are free for participants — for example, the Dojos run by Joel Bayubasire in Kampala, Uganda:

Joel Bayubasire with Ninjas at his Ugandan Dojo — 2000 Dojos

Empowering refugee children

This week, Joel set up his second Dojo and verified it on our global map. Joel is a Congolese refugee living in Kampala, Uganda, where he is currently completing his PhD in Economics at Madison International Institute and Business School.

Joel understands first-hand the challenges faced by refugees who were forced to leave their country due to war or conflict. Uganda is currently hosting more than 1.2 million refugees, 60% of whom are children (World Bank, 2017). As refugees, children are only allowed to attend local schools until the age of 12. This results in lower educational attainment, which is likely to affect their future employment prospects.

Two girls at a laptop. Joel Bayubasire CoderDojo — 2000 Dojos

Joel has the motivation to overcome these challenges, because he understands the power of education. Therefore, he initiated a number of community-based activities to provide educational opportunities for refugee children. As part of this, he founded his first Dojo earlier in the year, with the aim of giving these children a chance to compete in today’s global knowledge-based economy.

Two boys at a laptop. Joel Bayubasire CoderDojo — 2000 Dojos

Aware that securing volunteer mentors would be a challenge, Joel trained eight young people from the community to become youth mentors to their peers. He explains:

I believe that the mastery of computer coding allows talented young people to thrive professionally and enables them to not only be consumers but creators of the interconnected world of today!

Based on the success of Joel’s first Dojo, he has now expanded the CoderDojo initiative in his community; his plan is to provide computer science training for more than 300 refugee youths in Kampala by 2019. If you’d like to learn more about Joel’s efforts, head to this website.

Join the movement

If you are interested in creating opportunities for the young people in your community, then join the growing CoderDojo movement — you can volunteer to start a Dojo or to support an existing one today!

The post CoderDojo: 2000 Dojos ever appeared first on Raspberry Pi.

Copyright Holders Want ISPs to Police Pirate Sites and Issue Warnings

Post Syndicated from Ernesto original https://torrentfreak.com/copyright-holders-want-isps-to-police-pirate-sites-and-issue-warnings-171124/

Online piracy is a worldwide phenomenon and increasingly it ends up on the desks of lawmakers everywhere.

Frustrated by the ever-evolving piracy landscape, copyright holders are calling on local authorities to help out.

This is also the case in South Africa at the moment, where the Government is finalizing a new Cybercrimes and Cybersecurity Bill.

Responding to a call for comments, anti-piracy group SAFACT, film producers, and local broadcaster M-Net seized the opportunity to weigh in with some suggestions. Writing to the Department of Justice and Constitutional Development, they ask for measures to make it easier to block pirate sites and warn copyright infringers.

“A balanced approach to address the massive copyright infringement on the Internet is necessary,” they say.

On the site-blocking front, the copyright holder representatives suggest an EU-style amendment that would allow for injunctions against ISPs to bar access to pirate sites.

“It is suggested that South Africa should consider adopting technology-neutral ‘no fault’ enforcement legislation that would enable intermediaries to take action against online infringements, in line with Article 8.3 of the EU Copyright Directive (2001/29/EC), which addresses copyright infringement through site blocking,” it reads.

Request and response (via Business Tech)

In addition, ISPs should also be obliged to take further measures to deter piracy. New legislation should require providers to “police” unauthorized file-sharing and streaming sites, and warn subscribers who are caught pirating.

“Obligations should be imposed on ISPs to co-operate with rights-holders and Government to police illegal filesharing or streaming websites and to issue warnings to end-users identified as engaging in illegal file-sharing and to block infringing content,” the rightsholders say.

The Department recently made the demands public, along with an official response from the Government. While the suggestions are not dismissed on their merits, they don’t fit the purpose of the legislation.

“The Bill does not deal with copyright infringements. These aspects must be dealt with in terms of copyright-related legislation,” the Department writes.

SAFACT, the filmmakers, and M-Net are not without options though. The Government points out that the new Copyright Amendment Bill, which was introduced recently, would be a better fit for these asks. So it’s likely that they will try again.

This doesn’t mean that any of the proposed language will be adopted, of course. However, now that the demands are on the table, South Africans are likely to hear more blocking and warning chatter in the near future.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Build a Flick-controlled marble maze

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/flick-marble-maze/

Wiggle your fingers to guide a ball through a 3D-printed marble maze using the Pi Supply Flick board for Raspberry Pi!

Wiggle, wiggle, wiggle, wiggle, yeah

Using the Flick, previously seen in last week’s Hacker House gesture-controlled holographic visualiser, South Africa–based Tom Van den Bon has created a touch-free marble maze. If his Twitter is any indication, he was motivated by his love of game-making and 3D printing.

Tom Van den Bon on Twitter

Day 172 of #3dprint365. #3dprinted Raspberry PI Controlled Maze Thingie Part 3 #3dprint #3dprinter #thingiverse #raspberrypi #pisupply

All non-electronic parts of this build are 3D printed. The marble maze sits atop a motorised structure which moves along two axes thanks to servo motors. Tom controls the movement using gestures which are picked up by the Flick Zero, a Pi Zero–sized 3D-tracking board that can detect movement up to 15cm away.

Find the code for the maze, which takes advantage of the Flick library, on Tom’s GitHub account.
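
In outline, the control loop is small: flicklib delivers hand positions through decorator-registered callbacks, and those positions can be rescaled into servo commands. The sketch below assumes flicklib’s decorator-style move callback and uses gpiozero for the servos; the GPIO pins are placeholders, and this is not Tom’s actual code.

```python
import signal

import flicklib
from gpiozero import Servo

# Two servos tilt the maze along its two axes; the pins are placeholders.
servo_x = Servo(17)
servo_y = Servo(18)

@flicklib.move()
def on_move(x, y, z):
    # flicklib reports hand position in the range 0..1 on each axis;
    # gpiozero's Servo.value expects -1..1, so rescale before assigning.
    servo_x.value = (x * 2) - 1
    servo_y.value = (y * 2) - 1

signal.pause()  # keep the script alive while callbacks fire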

Make your own games

Our free resources are a treasure trove of fun home-brew games that you can build with your friends and family.

If you like physical games such as Tom’s gesture-controlled maze, you should definitely check out our Python quick reaction game! In it, players are pitted against each other to react as quickly as possible to an LED that lights up at random.
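
If you’re curious what such a game boils down to, here is a minimal gpiozero sketch: one LED, two buttons, and a random delay. The pins and timings are placeholders, not the resource’s exact code.

```python
import random
import time

from gpiozero import LED, Button

led = LED(4)  # pin numbers are placeholders
players = {"player 1": Button(14), "player 2": Button(15)}

time.sleep(random.uniform(3, 10))  # unpredictable wait keeps players honest
led.on()

# The first button pressed after the LED lights up wins.
while True:
    for name, button in players.items():
        if button.is_pressed:
            print(name, "wins!")
            raise SystemExit
```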

raspberry pi marble maze

You can also play solo with our Lights out game, where it’s you against four erratic lights eager to remain lit.

For games you can build on your computer with no need for any extra tech, Scratch games such as our button-smashing Olympic weightlifter and Hurdler projects are perfect — you can play them just using a keyboard and browser!

raspberry pi marble maze

And if you’d like to really get stuck into learning about game development, then you’re in luck! CoderDojo’s Make your own game book guides you through all the steps of building a game in JavaScript, from creating the world to designing characters.

Cover of CoderDojo Nano Make your own game

And because I just found this while searching for image content for today’s blog, here is a photo of Eben and Liz’s cat Mooncake with a Raspberry Pi on her head. Enjoy!

A cat with a Raspberry Pi pin on its head — raspberry pi marble maze

Ras-purry Pi?

The post Build a Flick-controlled marble maze appeared first on Raspberry Pi.

Hot Startups on AWS – October 2017

Post Syndicated from Tina Barr original https://aws.amazon.com/blogs/aws/hot-startups-on-aws-october-2017/

In 2015, the Centers for Medicare and Medicaid Services (CMS) reported that healthcare spending made up 17.8% of the U.S. GDP – that’s almost $3.2 trillion or $9,990 per person. By 2025, the CMS estimates this number will increase to nearly 20%. As cloud technology evolves in the healthcare and life science industries, we are seeing how companies of all sizes are using AWS to provide powerful and innovative solutions to customers across the globe. This month we are excited to feature the following startups:

  • ClearCare – helping home care agencies operate efficiently and grow their business.
  • DNAnexus – providing a cloud-based global network for sharing and managing genomic data.

ClearCare (San Francisco, CA)

ClearCare envisions a future where home care is the only choice for aging in place. Home care agencies play a critical role in the economy and their communities by significantly lowering the overall cost of care, reducing the number of hospital admissions, and bending the cost curve of aging. Patients receiving home care typically have multiple chronic conditions and functional limitations, driving over $190 billion in healthcare spending in the U.S. each year. To offset these costs, health insurance payers are developing in-home care management programs for patients. ClearCare’s goal is to help home care agencies leverage technology to improve costs, outcomes, and quality of life for the aging population. The company’s powerful software platform is specifically designed for use by non-medical, in-home care agencies to manage their businesses.

Founder and CEO Geoff Nudd created ClearCare because of his own grandmother’s need for care. Keeping family members and caregivers up to date on a loved one’s well-being can be difficult, so Geoff created what is now ClearCare’s Family Room, which enables caregivers and agency staff to check schedules and receive real-time updates about what’s happening in the home. Since then, agencies have provided feedback on other areas of their businesses that could be streamlined. ClearCare has now built over 20 modules to help home care agencies optimize operations with services including a telephony service, billing and payroll, and more. ClearCare now serves over 4,000 home care agencies, representing 500,000 caregivers and 400,000 seniors.

Using AWS, ClearCare is able to spin up reliable infrastructure for proofs of concept and iterate on those systems to quickly get value to market. The company runs many AWS services including Amazon Elasticsearch Service, Amazon RDS, and Amazon CloudFront. Amazon EMR and Amazon Athena have enabled ClearCare to build a Hadoop-based ETL and data warehousing system that processes terabytes of data each day. By utilizing these managed services, ClearCare has been able to go from concept to customer delivery in less than three months.
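
As a rough illustration of the Athena half of that pipeline: once the data sits in S3, running a query is a single API call. The database, table, and bucket names below are invented for the example; ClearCare’s actual schema isn’t public.

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Kick off an asynchronous SQL query over data in S3; results are written
# to the named output location. All names here are hypothetical.
response = athena.start_query_execution(
    QueryString="SELECT visit_date, COUNT(*) AS visits "
                "FROM care_visits GROUP BY visit_date",
    QueryExecutionContext={"Database": "example_warehouse"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])
```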

To learn more about ClearCare, check out their website.

DNAnexus (Mountain View, CA)

DNAnexus is accelerating the application of genomic data in precision medicine by providing a cloud-based platform for sharing and managing genomic and biomedical data and analysis tools. The company was founded in 2009 by Stanford graduate student Andreas Sundquist and two Stanford professors Arend Sidow and Serafim Batzoglou, to address the need for scaling secondary analysis of next-generation sequencing (NGS) data in the cloud. The founders quickly learned that users needed a flexible solution to build complex analysis workflows and tools that enable them to share and manage large volumes of data. DNAnexus is optimized to address the challenges of security, scalability, and collaboration for organizations that are pursuing genomic-based approaches to health, both in clinics and research labs. DNAnexus has a global customer base – spanning North America, Europe, Asia-Pacific, South America, and Africa – that runs a million jobs each month and is doubling their storage year-over-year. The company currently stores more than 10 petabytes of biomedical and genomic data. That is equivalent to approximately 100,000 genomes, or in simpler terms, over 50 billion Facebook photos!

DNAnexus is working with its customers to help expand their translational informatics research, which includes expanding into clinical trial genomic services. This will help companies developing different medicines to better stratify clinical trial populations and develop companion tests that enable the right patient to get the right medicine. In collaboration with Janssen Human Microbiome Institute, DNAnexus is also launching Mosaic – a community platform for microbiome research.

AWS provides DNAnexus and its customers the flexibility to grow and scale research programs. Building the technology infrastructure required to manage these projects in-house is expensive and time-consuming. DNAnexus removes that barrier for labs of any size by using AWS scalable cloud resources. The company deploys its customers’ genomic pipelines on Amazon EC2, using Amazon S3 for high-performance, high-durability storage, and Amazon Glacier for low-cost data archiving. DNAnexus is also an AWS Life Sciences Competency Partner.
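
The S3-to-Glacier pattern described above is usually expressed as a bucket lifecycle rule that transitions objects after a certain age. Here is a minimal boto3 sketch; the bucket name, prefix, and 90-day threshold are illustrative assumptions, not DNAnexus’s actual policy.

```python
import boto3

s3 = boto3.client("s3")

# Transition objects under the "runs/" prefix to Glacier after 90 days.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-genomics-data",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-runs",
                "Status": "Enabled",
                "Filter": {"Prefix": "runs/"},
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```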

Learn more about DNAnexus here.

-Tina

Twitter makers love Halloween

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/twitter-love-halloween/

Halloween is almost upon us! In honour of one of the maker community’s favourite howlidays, here are some posts from enthusiastic makers on Twitter to get you inspired and prepared for the big event.

Lorraine’s VR Puppet

Lorraine Underwood on Twitter

Using a @Raspberry_Pi with @pimoroni tilt hat to make a cool puppet for #Halloween https://t.co/pOeTFZ0r29

Made with a Pimoroni Pan-Tilt HAT, a Raspberry Pi, and some VR software on her phone, Lorraine Underwood‘s puppet is going to be a rather fitting doorman to interact with this year’s trick-or-treaters. Follow her project’s progress as she posts it on her blog.

Firr’s Monster-Mashing House

Firr on Twitter

Making my house super spooky for Halloween! https://t.co/w553l40BT0

Harnessing the one song guaranteed to earworm its way into my mind this October, Firr has upgraded his house to sing for all those daring enough to approach it this coming All Hallows’ Eve.

Firr used resources from Adafruit, along with three projectors, two Raspberry Pis, and some speakers, to create this semi-interactive display.

While the eyes can move on their own, a joystick can be added for direct control. Firr created a switch that toggles between autonomous animation and direct control.

Find out more on the htxt.africa website.

Justin’s Snake Eyes Pumpkin

Justin Smith on Twitter

First #pumpkin of the season for Friday the 13th! @PaintYourDragon’s snake eyes bonnet for the #RaspberryPi to handle the eye animation. https://t.co/TSlUUxYP5Q

The Animated Snake Eyes Bonnet is definitely one of the freakiest products to come from the Adafruit lab, and it’s the perfect upgrade for any carved pumpkin this Halloween. Attach the bonnet to a Raspberry Pi 3, or the smaller Zero or Zero W, to add animated eyes to your scary orange masterpiece, as Justin Smith demonstrates in his video. The effect will terrify even the bravest of trick-or-treaters! Just make sure you don’t light a candle in there too…we’re not sure how fire-proof the tech is.

And then there’s this…

EmmArarrghhhhhh on Twitter

Squishy eye keyboard? Anyone? Made with @Raspberry_Pi @pimoroni’s Explorer HAT Pro and a pile of stuff from @Poundland 😂👀‼️ https://t.co/qLfpLLiXqZ

Yeah…the line between frightening and funny is never thinner than on Halloween.

Make and share this Halloween!

For more Halloween project ideas, check out our free resources, including Scary ‘Spot the difference’ and the new Pioneers-inspired ‘Pride and Prejudice’ for zombies.

Halloween Pride and Prejudice Zombies Raspberry Pi

It is a truth universally acknowledged that a single man in possession of the zombie virus must be in want of braaaaaaains.

No matter whether you share your Halloween builds on Twitter, Facebook, G+, Instagram, or YouTube, we want to see them — make sure to tag us in your posts. We also have a comment section below this post, so go ahead and fill it with your ideas, links to completed projects, and general chat about the world of RasBOOrry Pi!

…sorry, that’s a hideous play on words. I apologise.

The post Twitter makers love Halloween appeared first on Raspberry Pi.

More Raspberry Pi labs in West Africa

Post Syndicated from Rachel Churcher original https://www.raspberrypi.org/blog/pi-based-ict-west-africa/

Back in May 2013, we heard from Dominique Laloux about an exciting project to bring Raspberry Pi labs to schools in rural West Africa. Until 2012, 75 percent of teachers there had never used a computer. The project has been very successful, and Dominique has been in touch again to bring us the latest news.

A view of the inside of the new Pi lab building

Preparing the new Pi labs building in Kuma Tokpli, Togo

Growing the project

Thanks to the continuing efforts of a dedicated team of teachers, parents and other supporters, the Centre Informatique de Kuma, now known as INITIC (from the French ‘INItiation aux TIC’), runs two Raspberry Pi labs in schools in Togo, and plans to open a third in December. The second lab was opened last year in Kpalimé, a town in the Plateaux Region in the west of the country.

Student using a Raspberry Pi computer

Using the new Raspberry Pi labs in Kpalimé, Togo

More than 400 students used the new lab intensively during the last school year. Dominique tells us more:

“The report made in early July by the seven teachers who accompanied the students was nothing short of amazing: the young people covered a very impressive number of concepts and skills, from the GUI and the file system, to a solid introduction to word processing and spreadsheets, and many other skills. The lab worked exactly as expected. Its 21 Raspberry Pis worked flawlessly, with the exception of a couple of SD cards that needed re-cloning, and a couple of old screens that needed to be replaced. All the Raspberry Pis worked without a glitch. They are so reliable!”

The teachers and students have enjoyed access to a range of software and resources, all running on Raspberry Pi 2s and 3s.

“Our current aim is to introduce the students to ICT using the Raspberry Pis, rather than introducing them to programming and electronics (a step that will certainly be considered later). We use Ubuntu Mate along with a large selection of applications, from LibreOffice, Firefox, GIMP, Audacity, and Calibre, to special maths, science, and geography applications. There are also special applications such as GnuCash and GanttProject, as well as logic games including PyChess. Since December, students also have access to a local server hosting Kiwix, Wiktionary (a local copy of Wikipedia in four languages), several hundred videos, and several thousand books. They really love it!”

Pi lab upgrade

This summer, INITIC upgraded the equipment in their Pi lab in Kuma Adamé, which has been running since 2014. Twenty-one older-model Raspberry Pis were replaced with Pi 2s and 3s to bring this lab into line with the others and encourage co-operation between the different locations.

“All 21 first-generation Raspberry Pis worked flawlessly for three years, despite the less-than-ideal conditions in which they were used — tropical conditions, dust, frequent power outages, etc. I brought them all back to Brussels, and they all still work fine. The rationale behind the upgrade was to bring more computing power to the lab, and also to have the same equipment in our two Raspberry Pi labs (and in other planned installations).”

Students and teachers using the upgraded Pi lab in Kuma Adamé

An upgrade of the organisation’s first lab, installed in 2012 in Kuma Tokpli, will be completed in December. This lab currently uses ‘retired’ laptops, which will be replaced with Raspberry Pis and peripherals. INITIC, in partnership with the local community, is also constructing a new building to house the upgraded technology, and the organisation’s third Raspberry Pi lab.

Reliable tech

Dominique has been very impressed with the performance of the Raspberry Pis since 2014.

“Our experience of three years, in two very different contexts, clearly demonstrates that the Raspberry Pi is a very convincing alternative to more ‘conventional’ computers for introducing young students to ICT where resources are scarce. I wish I could convince more communities in the world to invest in such ‘low cost, low consumption, low maintenance’ infrastructure. It really works!”

He goes on to explain that:

“Our goal now is to build at least one new Raspberry Pi lab in another Togolese school each year. That will, of course, depend on how successful we are at gathering the funds necessary for each installation, but we are confident we can convince enough friends to give us the financial support needed for our action.”

A desk with Raspberry Pis and peripherals

Reliable Raspberry Pis in the labs at Kpalimé

Get involved

We are delighted to see the Raspberry Pi being used to bring information technology to new teachers, students, and communities in Togo – it’s wonderful to see this project becoming established and building on its achievements. The mission of the Raspberry Pi Foundation is to put the power of digital making into the hands of people all over the world. Therefore, projects like this, in which people use our tech to fulfil this mission in places with few resources, are wonderful to us.

More information about INITIC and its projects can be found on its website. If you are interested in helping the organisation to meet its goals, visit the How to help page. And if you are involved with a project like this, bringing ICT, computer science, and coding to new places, please tell us about it in the comments below.

The post More Raspberry Pi labs in West Africa appeared first on Raspberry Pi.

Bringing Clean and Safe Drinking Water to Developing Countries

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/keeping-charity-water-data-safe/

image of a cup filling with water

If you’d like to read more about charity: water‘s use of Backblaze for Business, visit backblaze.com/charitywater/

charity: water + Backblaze for Business

Considering that charity: water sends workers with laptops to rural communities in 24 countries around the world, it’s not surprising that backup is needed on every computer they have. It’s so essential that Matt Ward, System Administrator for charity: water, says it’s a standard part of employee on-boarding.

charity: water, based in New York City, is a non-profit organization that is working to bring clean water to the nearly one in ten people around the world who live without it — a situation that affects not only health, but education and income.

“We have people constantly traveling all over the world, so a cloud-based service makes sense whether the user is in New York or Malawi. Most of our projects and beneficiaries are in Sub-Saharan Africa and Southern/Southeast Asia,” explains Matt. “Water scarcity and poor water quality are a problem here, and in so many countries around the world.”

charity: water in Rwanda

To achieve their mission, charity: water works through implementing organizations on the ground within the targeted communities. The people in these communities must spend hours every day walking to collect water for their families. It’s a losing proposition, as the time they spend walking takes away from education, earning money, and generally limits the opportunities for improving their lives.

charity: water began using Backblaze for Business before Matt came on a year ago. They started with a few licenses, but quickly decided to deploy Backblaze to every computer in the organization.

“We’ve lost computers plenty of times,” he says, “but, because of Backblaze, there’s never been a case where we lost the computer’s data.”

charity: water has about 80 staff computer users, and adds ten to twenty interns each season. Each staff member or intern has at least one computer. “Our IT department is two people, me and my director,” explains Matt, “and we have to support everyone, so being super simple to deploy is valuable to us.”

“When a new person joins us, we just send them an invitation to join the Group on Backblaze, and they’re all set. Their data is automatically backed up whenever they’re connected to the internet, and I can see their current status on the management console. [Backblaze] really nailed the user interface. You can show anyone the interface, even on their first day, and they get it because it’s simple and easy to understand.”

young girl drinking clean water

One of the frequent uses for Backblaze for Business is when Matt off-boards users, such as all the interns at the end of the season. He starts a restore through the Backblaze admin console even before he has the actual computer. “I know I have a reliable archive in the restore from Backblaze, and it’s easier than doing it directly from the laptop.”

Matt is an enthusiastic user of the features designed for business users, especially Backblaze’s Groups feature, which has enabled charity: water to centralize billing and computer management for their worldwide team. Businesses can create groups to cluster job functions, employee locations, or any other criteria.

charity: water delivering clean water to children

“It saves me time to be able to see the status of any user’s backups, such as the last time the data was backed up,” explains Matt. Before Backblaze, charity: water was writing documentation for workers, hoping they would follow backup protocols. Now, Matt knows what’s going on in real time — a valuable feature when the laptops are dispersed around the world.

“Backblaze for Business is an essential element in any organization’s IT continuity plan,” says Matt. “You need to be sure that there is a backup solution for your data should anything go wrong.”

To learn more about how charity: water uses Backblaze for Business, visit backblaze.com/charitywater/.

Matt Ward of charity: water

Matt Ward, System Administrator for charity: water

The post Bringing Clean and Safe Drinking Water to Developing Countries appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Google Signs Agreement to Tackle YouTube Piracy

Post Syndicated from Andy original https://torrentfreak.com/google-signs-unprecedented-agreement-to-tackle-youtube-piracy-170921/

Once upon a time, people complaining about piracy would point to the hundreds of piracy sites around the Internet. These days, criticism is just as likely to be leveled at Google-owned services.

YouTube, in particular, has come in for intense criticism, with the music industry complaining of exploitation of the DMCA in order to obtain unfair streaming rates from record labels. Along with stream-ripping, this so-called Value Gap is one of the industry’s hottest topics.

With rightsholders seemingly at war with Google to varying degrees, news from France suggests that progress can be made if people sit down and negotiate.

According to local reports, Google and local anti-piracy outfit ALPA (l’Association de Lutte Contre la Piraterie Audiovisuelle) under the auspices of the CNC have signed an agreement to grant rightsholders direct access to content takedown mechanisms on YouTube.

YouTube has granted access to its Content ID systems to companies elsewhere for years but the new deal will see the system utilized by French content owners for the first time. It’s hoped that the access will result in infringing content being taken down or monetized more quickly than before.

“We do not want fraudsters to use our platforms to the detriment of creators,” said Carlo D’Asaro Biondo, Google’s President of Strategic Relationships in Europe, the Middle East and Africa.

The agreement, overseen by the Ministry of Culture, will see Google provide ALPA with financial support and rightsholders with essential training.

ALPA president Nicolas Seydoux welcomed the deal, noting that it symbolizes the “collapse of the wall of incomprehension” that previously existed between France’s rightsholders and the Internet search giant.

The deal forms part of the French government’s “Plan of Action Against Piracy”, in which it hopes to crack down on infringement in various ways, including tackling the threat of pirate sites, better promotion of services offering legitimate content, and educating children “from an early age” on the need to respect copyright.

“The fight against piracy is the great challenge of the new century in the cultural sphere,” said France’s Minister of Culture, Françoise Nyssen.

“I hope this is just the beginning of a process. It will require other agreements with rights holders and other platforms, as well as at the European level.”

According to NextInpact, the Google agreement will eventually encompass the downgrading of infringing content in search results as part of the Trusted Copyright Removal Program. A similar system is already in place in the UK.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Can an Army of Bitcoin “Bounty Hunters” Deter Pirates?

Post Syndicated from Ernesto original https://torrentfreak.com/can-an-army-of-bitcoin-bounty-hunters-deter-pirates-170917/

When we first heard of the idea to use Bitcoin bounties to track down pirated content online, we scratched our heads.

Snitching on copyright infringers is not a new concept, but the idea of instant cash rewards though cryptocurrency was quite novel.

In theory, it’s pretty straightforward. Content producers can add a unique identifying watermark into movies, eBooks, or other digital files before they’re circulated. When these somehow leak to the public, the bounty hunters use the watermark to claim their Bitcoin, alerting the owner in the process.

This helps to spot leaks early on, even on networks where automated tools don’t have access, and identify the source at the same time.
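
To make the mechanics concrete, here is a toy sketch of the bookkeeping such a scheme implies: every distributed copy gets a unique watermark ID tied to its recipient, so a claimed ID points straight at the leaked copy’s source. Custos’s real system embeds the ID invisibly in the file and pays the bounty in Bitcoin; none of that machinery is modelled here.

```python
import secrets

# watermark ID -> recipient, recorded when each copy is generated
registry = {}

def issue_copy(recipient):
    """Create a unique watermark ID for one recipient's copy."""
    watermark_id = secrets.token_hex(8)
    registry[watermark_id] = recipient
    return watermark_id

def claim_bounty(watermark_id):
    """A bounty hunter submits an extracted ID; the leak's source is looked up."""
    recipient = registry.get(watermark_id)
    if recipient is None:
        return "unknown watermark: no bounty"
    return "leak traced to the copy issued to {!r}".format(recipient)

copy_id = issue_copy("reviewer-42")
print(claim_bounty(copy_id))  # leak traced to the copy issued to 'reviewer-42'
```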

Two years have passed and it looks like the idea was no fluke. Custos, the South African company that owns the technology, has various copyright holders on board and recently announced a new partnership with book publisher Erudition Digital.

With help from anti-piracy outfit Digimarc, the companies will add identifying watermarks to eBook releases, counting on the bounty hunters to keep an eye out for leaks. These bounty hunters don’t have to be anti-piracy experts. On the contrary, pirates are more than welcome to help out.

“The Custos approach is revolutionary in that it attacks the economy of piracy by targeting uploaders rather than downloaders, turning downloaders into an early detection network,” the companies announced a few days ago.

“The result is pirates turn on one another, sowing seeds of distrust amongst their communities. As a result, the Custos system is capable of penetrating hard-to-reach places such as the dark web, peer-to-peer networks, and even email.”



Devon Weston, Director of Market Development for Digimarc Guardian, believes that this approach is the next level in anti-piracy efforts. It complements the automated detection tools that have been available in the past by providing access to hard-to-reach places.

“Together, this suite of products represents the next generation in technical measures against eBook piracy,” Weston commented on the partnership.

TorrentFreak reached out to Custos COO Fred Lutz to find out what progress the company has made in recent years. We were informed that they have been protecting thousands of copies every month, ranging from pre-release movie content to eBooks.

At the moment the company works with a selected group of “bounty hunters,” but they plan to open the extraction tool to the public in the near future, so everyone can join in.

“So far we have carefully seeded the free bounty extractor tool in relevant communities with great success. However, in the next phase, we will open the bounty hunting to the general public. We are just careful not to grow the bounty hunting community faster than the number of bounties in the wild require,” Lutz tells us.

The Bitcoin bounties themselves vary in size based on the specific use case. For a movie screener, they are typically anything between $10 and $50. However, for the most sensitive content, they can be $100 or more.

“We can also adjust the bounty over time based on the customer’s needs. A low-quality screener that was very sensitive prior to cinematic release does not require as large a bounty after cam-rips become available,” Lutz notes.

Thus far, roughly 50 Bitcoin bounties have been claimed. Some of these were planted by Custos themselves, as an incentive for the bounty hunters. Not a very high number, but that doesn’t mean that it’s not working.

“While this number might seem a bit small compared to the number of copies we protect, our aim is first and foremost not to detect leaks, but to pose a credible threat of quick detection and being caught.”

People who receive content protected by Custos are made aware of the watermarks, which may make them think twice about sharing it. If that’s the case, then the system is having an effect without any bounties being claimed.

The question remains how many people will actively help to spot bounties. The success of the system largely depends on volunteers, and not all pirates are eager to rat on the people that provide free content.

On the other hand, there’s also room to abuse the system. In theory, people could claim the bounties on their own eBooks and claim that they’ve lost their e-reader. That would be fraud, of course, but since the bounties are in Bitcoin this isn’t easy to prove.

That brings us to the final question: what happens if a claimed bounty identifies a leaker? Custos admits that this alone isn’t enough evidence to pursue a legal case, but the measures taken in response are up to the copyright holders.

“A claim of a bounty is never a sufficient legal proof of piracy, instead, it is an invaluable first piece of evidence on which a legal case could be built if the client so requires. Legal prosecution is definitely not always the best approach to dealing with leaks,” Lutz says.

Time will tell if the Bitcoin bounty approach works…

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Streaming Service iflix Buys Shows Based on Piracy Data

Post Syndicated from Ernesto original https://torrentfreak.com/streaming-service-iflix-buys-shows-based-on-piracy-data-170819/

When major movie and TV companies discuss piracy they often mention the massive losses incurred as a result of unauthorized downloads and streams.

However, this unofficial market also offers a valuable pool of often publicly available data on the media consumption habits of a relatively young generation.

Many believe that piracy is in part a market signal showing copyright holders what consumers want. This makes piracy statistics key business intelligence, which some companies have started to realize.

Netflix, for example, previously said that its offering is partly based on what shows do well on BitTorrent networks and other pirate sites. In addition, the streaming service uses piracy data to figure out how much it can charge in a country. It is not alone.

Other major entertainment companies also keep a close eye on piracy, using this data to their advantage. This includes the Asia-based streaming portal iFlix, which recently secured $133 million in funding and boasts to have over five million users.

Iflix co-founder Patrick Grove says that his company actively uses piracy numbers to determine what content to acquire. The data reveal what is popular locally and help to give viewers the TV shows and movies they’re most interested in.

“We looked at piracy data in every market,” Grove informed CNBC’s Managing Asia. The research doesn’t stop at a few torrent download numbers, either.

Representatives from the Asian company actually went out on the streets to buy pirated DVDs from street vendors. In addition, iflix received help from local Internet providers, which shared a variety of streaming data.

TorrentFreak reached out to the streaming service to get more details about their data gathering techniques. One of the main partners to measure online piracy is the German company TECXIPIO, which is known to actively monitor BitTorrent traffic.

The company also maintains a close relationship with Internet providers that offer further insight, including streaming data, to determine which titles work best in each market.

While analyzing the different sets of data, the streaming service was surprised by the diversity between regions, as well as by the ever-changing consumer demand.

“Through looking at the Top 20 pirated DVDs in every market we are live in, we were surprised to find the amount of pirated K-drama content. In Ghana, for example, the number one pirated title is a K-drama series called ‘Legend of the Blue Sea’,” an iflix spokesperson told us.

Iflix believes that piracy data is superior to other market intelligence. Before rolling out its service in Saudi Arabia, the company made a list of the 1,000 most popular shows and used that to its advantage.

While there is a lot of piracy in emerging markets, iflix doesn’t believe that people are unwilling to pay for entertainment. It just has to be available at a decent price, and that’s where the company comes in.

“We believe that people in emerging markets do not actively want to steal content, they do so because there is no better alternative,” the company informs us.

“As consumers become more connected, gaining access to information and cultural influences on a global scale, they want to be entertained at a world-class standard. We set out with the aim of offering an alternative that is better than piracy; by providing unlimited access to high-quality, world-class entertainment, all at the price of a pirated DVD.”

There is no doubt that iflix is ambitious, and that it’s willing to employ some unusual tactics to grow its userbase. The company is quite optimistic about the future as well, judging from its co-founder’s prediction that it will welcome its billionth viewer in a few years.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Nazis are bad

Post Syndicated from Eevee original https://eev.ee/blog/2017/08/13/nazis-are-bad/

Anonymous asks:

Could you talk about something related to the management/moderation and growth of online communities? IOW your thoughts on online community management, if any.

I think you’ve tweeted about this stuff in the past so I suspect you have thoughts on this, but if not, again, feel free to just blog about … anything 🙂

Oh, I think I have some stuff to say about community management, in light of recent events. None of it is anything that hasn’t already been said elsewhere, but I have to get this out.

Hopefully the content warning is implicit in the title.


I am frustrated.

I’ve gone on before about a particularly bothersome phenomenon that hurts a lot of small online communities: often, people are willing to tolerate the misery of others in a community, but then get up in arms when someone pushes back. Someone makes a lot of off-hand, off-color comments about women? Uses a lot of dog-whistle terms? Eh, they’re not bothering anyone, or at least not bothering me. Someone else gets tired of it and tells them to knock it off? Whoa there! Now we have the appearance of conflict, which is unacceptable, and people will turn on the person who’s pissed off — even though they’ve been at the butt end of an invisible conflict for who knows how long. The appearance of peace is paramount, even if it means a large chunk of the population is quietly miserable.

Okay, so now, imagine that on a vastly larger scale, and also those annoying people who know how to skirt the rules are Nazis.


The label “Nazi” gets thrown around a lot lately, probably far too easily. But when I see a group of people doing the Hitler salute, waving large Nazi flags, wearing Nazi armbands styled after the SS, well… if the shoe fits, right? I suppose they might have flown across the country to join a torch-bearing mob ironically, but if so, the joke is going way over my head. (Was the murder ironic, too?) Maybe they’re not Nazis in the sense that the original party doesn’t exist any more, but for ease of writing, let’s refer to “someone who espouses Nazi ideology and deliberately bears a number of Nazi symbols” as, well, “a Nazi”.

This isn’t a new thing, either; I’ve stumbled upon any number of Twitter accounts that are decorated in Nazi regalia. I suppose the trouble arises when perfectly innocent members of the alt-right get unfairly labelled as Nazis.

But hang on; this march was called “Unite the Right” and was intended to bring together various far right sub-groups. So what does their choice of aesthetic say about those sub-groups? I haven’t heard, say, alt-right coiner Richard Spencer denounce the use of Nazi symbology — extra notable since he was fucking there and apparently didn’t care to discourage it.


And so begins the rule-skirting. “Nazi” is definitely overused, but even using it to describe white supremacists who make not-so-subtle nods to Hitler is likely to earn you some sarcastic derailment. A Nazi? Oh, so is everyone you don’t like and who wants to establish a white ethno state a Nazi?

Calling someone a Nazi — or even a white supremacist — is an attack, you see. Merely expressing the desire that people of color not exist is perfectly peaceful, but identifying the sentiment for what it is causes visible discord, which is unacceptable.

These clowns even know this sort of thing and strategize around it. Or, try, at least. Maybe it wasn’t that successful this weekend — though flicking through Charlottesville headlines now, they seem to be relatively tame in how they refer to the ralliers.

I’m reminded of a group of furries — the alt-furries — who have been espousing white supremacy and wearing red armbands with a white circle containing a black… pawprint. Ah, yes, that’s completely different.


So, what to do about this?

“Ignore them” is a popular option, often espoused to bullied children by parents who have never been bullied, shortly before they resume complaining about passive-aggressive office politics. The trouble with ignoring them is that, just like in smaller communities, they have a tendency to fester. They take over large chunks of influential Internet surface area like 4chan and Reddit; they help get an inept buffoon elected; and then they start to have torch-bearing rallies and run people over with cars.

4chan illustrates a kind of corollary here. Anyone who’s steeped in Internet Culture™ is surely familiar with 4chan; I was never a regular visitor, but it had enough influence that I was still aware of it and some of its culture. It was always thick with irony, which grew into a sort of ironic detachment — perhaps one of the major sources of the recurring online trope that having feelings is bad — which proceeded into ironic racism.

And now the ironic racism is indistinguishable from actual racism, as tends to be the case. Do they “actually” “mean it”, or are they just trying to get a rise out of people? What the hell is unironic racism if not trying to get a rise out of people? What difference is there to onlookers, especially as they move to become increasingly involved with politics?

“It’s just a joke” and “it was just a thoughtless comment” are exceptionally common defenses made by people desperate to preserve the illusion of harmony, but the strain of overt white supremacy currently running rampant through the US was built on those excuses.


The other favored option is to debate them, to defeat their ideas with better ideas.

Well, hang on. What are their ideas, again? I hear they were chanting stuff like “go back to Africa” and “fuck you, faggots”. Given that this was an overtly political rally (and again, the Nazi fucking regalia), I don’t think it’s a far cry to describe their ideas as “let’s get rid of black people and queer folks”.

This is an underlying proposition: that white supremacy is inherently violent. After all, if the alt-right seized total political power, what would they do with it? If I asked the same question of Democrats or Republicans, I’d imagine answers like “universal health care” or “screw over poor people”. But people whose primary goal is to have a country full of only white folks? What are they going to do, politely ask everyone else to leave? They’re invoking the memory of people who committed genocide and also tried to take over the fucking world. They are outright saying, these are the people we look up to, this is who we think had a great idea.

How, precisely, does one defeat these ideas with rational debate?

Because the underlying core philosophy beneath all this is: “it would be good for me if everything were about me”. And that’s true! (Well, it probably wouldn’t work out how they imagine in practice, but it’s true enough.) Consider that slavery is probably fantastic if you’re the one with the slaves; the issue is that it’s reprehensible, not that the very notion contains some kind of 101-level logical fallacy. That’s probably why we had a fucking war over it instead of hashing it out over brunch.

…except we did hash it out over brunch once, and the result was that slavery was still allowed but slaves only counted as 60% of a person for the sake of counting how much political power states got. So that’s how rational debate worked out. I’m sure the slaves were thrilled with that progress.


That really only leaves pushing back, which raises the question of how to push back.

And, I don’t know. Pushing back is much harder in spaces you don’t control, spaces you’re already struggling to justify your own presence in. For most people, that’s most spaces. It’s made all the harder by that tendency to preserve illusory peace; even the tamest request that someone knock off some odious behavior can be met by pushback, even by third parties.

At the same time, I’m aware that white supremacists prey on disillusioned young white dudes who feel like they don’t fit in, who were promised the world and inherited kind of a mess. Does criticism drive them further away? The alt-right also opposes “political correctness”, i.e. “not being a fucking asshole”.

God knows we all suck at this kind of behavior correction, even within our own in-groups. Fandoms have become almost ridiculously vicious as platforms like Twitter and Tumblr amplify individual anger to deafening levels. It probably doesn’t help that we’re all just exhausted, that every new fuck-up feels like it bears the same weight as the last hundred combined.

This is the part where I admit I don’t know anything about people and don’t have any easy answers. Surprise!


The other alternative is, well, punching Nazis.

That meme kind of haunts me. It raises really fucking complicated questions about when violence is acceptable, in a culture that’s completely incapable of answering them.

America’s relationship to violence is so bizarre and two-faced as to be almost incomprehensible. We worship it. We have the biggest military in the world by an almost comical margin. It’s fairly mainstream to own deadly weapons for the express stated purpose of armed revolution against the government, should that become necessary, where “necessary” is left ominously undefined. Our movies are about explosions and beating up bad guys; our video games are about explosions and shooting bad guys. We fantasize about solving foreign policy problems by nuking someone — hell, our talking heads are currently in polite discussion about whether we should nuke North Korea and annihilate up to twenty-five million people, as punishment for daring to have the bomb that only we’re allowed to have.

But… violence is bad.

That’s about as far as the other side of the coin gets. It’s bad. We condemn it in the strongest possible terms. Also, guess who we bombed today?

I observe that the one time Nazis were a serious threat, America was happy to let them try to take over the world until their allies finally showed up on our back porch.

Maybe I don’t understand what “violence” means. In a quest to find out why people are talking about “leftist violence” lately, I found a National Review article from May that twice suggests blocking traffic is a form of violence. Anarchists have smashed some windows and set a couple fires at protests this year — and, hey, please knock that crap off? — which is called violence against, I guess, Starbucks. Black Lives Matter could be throwing a birthday party and Twitter would still be abuzz with people calling them thugs.

Meanwhile, there’s a trend of murderers with increasingly overt links to the alt-right, and everyone is still handling them with kid gloves. First it was murders by people repeating their talking points; now it’s the culmination of a torches-and-pitchforks mob. (Ah, sorry, not pitchforks; assault rifles.) And we still get this incredibly bizarre both-sides-ism, a White House that refers to the people who didn’t murder anyone as “just as violent if not more so”.


Should you punch Nazis? I don’t know. All I know is that I’m extremely dissatisfied with discourse that’s extremely alarmed by hypothetical punches — far more mundane than what you’d see after a sporting event — but treats a push for ethnic cleansing as a mere difference of opinion.

The equivalent to a punch in an online space is probably banning, which is almost laughable in comparison. It doesn’t cause physical harm, but it is a use of concrete force. Doesn’t pose quite the same moral quandary, though.

Somewhere in the middle is the currently popular pastime of doxxing (doxxxxxxing) people spotted at the rally in an attempt to get them fired or whatever. Frankly, that skeeves me out, though apparently not enough that I’m directly chastising anyone for it.


We aren’t really equipped, as a society, to deal with memetic threats. We aren’t even equipped to determine what they are. We had a fucking world war over this, and now people are outright saying “hey I’m like those people we went and killed a lot in that world war” and we give them interviews and compliment their fashion sense.

A looming question is always, what if they then do it to you? What if people try to get you fired, to punch you for your beliefs?

I think about that a lot, and then I remember that it’s perfectly legal to fire someone for being gay in half the country. (Courts are currently wrangling whether Title VII forbids this, but with the current administration, I’m not optimistic.) I know people who’ve been fired for coming out as trans. I doubt I’d have to look very far to find someone who’s been punched for either reason.

And these aren’t even beliefs; they’re just properties of a person. You can stop being a white supremacist, one of those people yelling “fuck you, faggots”.

So I have to recuse myself from this asinine question, because I can’t fairly judge the risk of retaliation when it already happens to people I care about.

Meanwhile, if a white supremacist does get punched, I absolutely still want my tax dollars to pay for their universal healthcare.


The same wrinkle comes up with free speech, which is paramount.

The ACLU reminds us that the First Amendment “protects vile, hateful, and ignorant speech”. I think they’ve forgotten that that’s a side effect, not the goal. No one sat down and suggested that protecting vile speech was some kind of noble cause, yet that’s how we seem to be treating it.

The point was to avoid a situation where the government is arbitrarily deciding what qualifies as vile, hateful, and ignorant, and was using that power to eliminate ideas distasteful to politicians. You know, like, hypothetically, if they interrogated and jailed a bunch of people for supporting the wrong economic system. Or convicted someone under the Espionage Act for opposing the draft. (Hey, that’s where the “shouting fire in a crowded theater” line comes from.)

But these are ideas that are already in the government. Bannon, a man who was chair of a news organization he himself called “the platform for the alt-right”, has the President’s ear! How much more mainstream can you get?

So again I’m having a little trouble balancing “we need to defend the free speech of white supremacists or risk losing it for everyone” against “we fairly recently were ferreting out communists and the lingering public perception is that communists are scary, not that the government is”.


This isn’t to say that freedom of speech is bad, only that the way we talk about it has become fanatical to the point of absurdity. We love it so much that we turn around and try to apply it to corporations, to platforms, to communities, to interpersonal relationships.

Look at 4chan. It’s completely public and anonymous; you only get banned for putting the functioning of the site itself in jeopardy. Nothing is stopping a larger group of people from joining its politics board and tilting sentiment the other way — except that the current population is so odious that no one wants to be around them. Everyone else has evaporated away, as tends to happen.

Free speech is great for a government, to prevent quashing politics that threaten the status quo (except it’s a joke and they’ll do it anyway). People can’t very readily just bail when the government doesn’t like them, anyway. It’s also nice to keep in mind to some degree for ubiquitous platforms. But the smaller you go, the easier it is for people to evaporate away, and the faster pure free speech will turn the place to crap. You’ll be left only with people who care about nothing.


At the very least, it seems clear that the goal of white supremacists is some form of destabilization, of disruption to the fabric of a community for purely selfish purposes. And those are the kinds of people you want to get rid of as quickly as possible.

Usually this is hard, because they act just nicely enough to create some plausible deniability. But damn, if someone is outright telling you they love Hitler, maybe skip the principled hand-wringing and eject them.

Approved Reseller programme launch PLUS more Pi Zero resellers

Post Syndicated from Mike Buffham original https://www.raspberrypi.org/blog/approved-reseller/

Ever since the launch of the first Raspberry Pi back in 2012, one thing that has been critical to us is to make our products easy to buy in as many countries as possible.

Buying a Raspberry Pi is certainly much simpler nowadays than it was when we were just starting out. Nevertheless, we want to go even further, and so today we are introducing an Approved Reseller programme. With this programme, we aim to recognise those resellers that represent Raspberry Pi products well, and make purchasing them easy for their customers.

The Raspberry Pi Approved Reseller programme

We’re launching the programme in eleven countries today: the UK, Ireland, France, Spain, Portugal, Italy, the Netherlands, Belgium, Luxembourg, Greece and South Africa. Over the next few weeks, you will see us expand it to at least 50 countries.

We will link to the Approved Resellers’ websites directly from our Products page via the “Buy now” button. For customers who want to buy for business applications, we have also added a “Buy for business” button. After clicking it, you will be able to select your country from a drop-down menu. Doing so will link you directly to the local websites of our two licensed partners, Premier Farnell and Electrocomponents.

Our newest Raspberry Pi Zero resellers

On top of this we are also adding 6 new Raspberry Pi Zero resellers, giving 13 countries direct access to the Raspberry Pi Zero for the first time. We are particularly excited that these countries include Brazil and India, since they both have proved difficult to supply in the past.

The full list of new resellers covers the following countries:

  • Hong Kong and China
  • Brazil
  • India
  • Czech Republic and Slovakia
  • Slovenia, Croatia, Serbia and Bosnia-Herzegovina
  • Romania, Bulgaria and Hungary
  • Mexico

The post Approved Reseller programme launch PLUS more Pi Zero resellers appeared first on Raspberry Pi.

AIMS Desktop 2017.1 released

Post Syndicated from corbet original https://lwn.net/Articles/725712/rss

The AIMS desktop is a Debian-derived distribution aimed at mathematical and scientific use. This project’s first public release, based on Debian 9, is now available. It is a GNOME-based distribution with a bunch of add-on software, maintained by AIMS (the African Institute for Mathematical Sciences), a pan-African network of centres of excellence enabling Africa’s talented students to become innovators driving the continent’s scientific, educational and economic self-sufficiency.

AWS Big Data Blog Month in Review: April 2017

Post Syndicated from Derek Young original https://aws.amazon.com/blogs/big-data/aws-big-data-blog-month-in-review-april-2017/

Another month of big data solutions on the Big Data Blog. Please take a look at our summaries below and learn, comment, and share. Thank you for reading!

NEW POSTS

Amazon QuickSight Spring Announcement: KPI Charts, Export to CSV, AD Connector, and More! 
In this blog post, we share a number of new features and enhancements in Amazon QuickSight. You can now create key performance indicator (KPI) charts, define custom ranges when importing Microsoft Excel spreadsheets, export data to comma-separated value (CSV) format, and create aggregate filters for SPICE data sets. In the Enterprise Edition, we added an additional option to connect to your on-premises Active Directory using AD Connector.

Securely Analyze Data from Another AWS Account with EMRFS
Sometimes, data to be analyzed is spread across buckets owned by different accounts. In order to ensure data security, appropriate credentials management needs to be in place. This is especially true for large enterprises storing data in different Amazon S3 buckets for different departments. This post shows how you can use a custom credentials provider to access S3 objects that cannot be accessed by the default credentials provider of EMRFS.

Querying OpenStreetMap with Amazon Athena
This post explains how anyone can use Amazon Athena to quickly query publicly available OSM data stored in Amazon S3 (updated weekly) as an AWS Public Dataset. Imagine that you work for an NGO interested in improving knowledge of and access to health centers in Africa. You might want to know what’s already been mapped, to facilitate the production of maps of surrounding villages, and to determine where infrastructure investments are likely to be most effective.

Build a Real-time Stream Processing Pipeline with Apache Flink on AWS
This post outlines a reference architecture for a consistent, scalable, and reliable stream processing pipeline that is based on Apache Flink using Amazon EMR, Amazon Kinesis, and Amazon Elasticsearch Service. An AWSLabs GitHub repository provides the artifacts that are required to explore the reference architecture in action. Resources include a producer application that ingests sample data into an Amazon Kinesis stream and a Flink program that analyses the data in real time and sends the result to Amazon ES for visualization.

Manage Query Workloads with Query Monitoring Rules in Amazon Redshift
Amazon Redshift is a powerful, fully managed data warehouse that can offer significantly increased performance and lower cost in the cloud. However, queries which hog cluster resources (rogue queries) can affect your experience. In this post, you learn how query monitoring rules can help spot and act against such queries. This, in turn, can help you to perform smooth business operations in supporting mixed workloads to maximize cluster performance and throughput.

Amazon QuickSight Now Supports Audit Logging with AWS CloudTrail
In this post, we announce support for AWS CloudTrail in Amazon QuickSight, which allows logging of QuickSight events across an AWS account. Whether you have an enterprise setting or a small team scenario, this integration will allow QuickSight administrators to accurately answer questions such as who last changed an analysis, or who has connected to sensitive data. With CloudTrail, administrators have better governance, auditing and risk management of their QuickSight usage.

Near Zero Downtime Migration from MySQL to DynamoDB
This post introduces two methods of seamlessly migrating data from MySQL to DynamoDB, minimizing downtime and converting the MySQL key design into one more suitable for NoSQL.


Want to learn more about Big Data or Streaming Data? Check out our Big Data and Streaming data educational pages.

Leave a comment below to let us know what big data topics you’d like to see next on the AWS Big Data Blog.

250,000 Pi Zero W units shipped and more Pi Zero distributors announced

Post Syndicated from Mike Buffham original https://www.raspberrypi.org/blog/pi-zero-distributors-annoucement/

This week, just nine weeks after its launch, we will ship the 250,000th Pi Zero W into the market. As well as hitting that pretty impressive milestone, today we are announcing 13 new Raspberry Pi Zero distributors, so you should find it much easier to get hold of a unit.


This significantly extends the reach we can achieve with Pi Zero and Pi Zero W across the globe. These new distributors serve Australia and New Zealand, Italy, Malaysia, Japan, South Africa, Poland, Greece, Switzerland, Denmark, Sweden, Norway, and Finland. We are also further strengthening our network in the USA, Canada, and Germany, where demand continues to be very high.


A common theme on the Raspberry Pi forums has been the difficulty of obtaining a Zero or Zero W in a number of countries. This has been most notable in the markets which are furthest away from Europe or North America. We are hoping that adding these new distributors will make it much easier for Pi-fans across the world to get hold of their favourite tiny computer.

We know there are still more markets to cover, and we are continuing to work with other potential partners to improve the Pi Zero reach. Watch this space for even further developments!

Who are the new Pi Zero Distributors?

Check the list below to find the distributor that’s best for you!

  • Australia and New Zealand: Core Electronics, PiAustralia
  • South Africa: PiShop
    (Please note: Pi Zero W is not currently available to buy in South Africa, as we are waiting for ICASA certification.)
  • Denmark, Sweden, Finland, and Norway: JKollerup, electro:kit
  • Germany and Switzerland: sertronics, pi-shop
  • Poland: botland
  • Greece: nettop
  • Italy
  • Japan: KSY, Switch Science
    (Please note: Pi Zero W is not currently available to buy in Japan, as we are waiting for TELEC certification.)
  • Malaysia: Cytron
    (Please note: Pi Zero W is not currently available to buy in Malaysia, as we are waiting for SIRIM certification.)
  • Canada and USA: BuyaPi

Get your Pi Zero

For full product details, plus a complete list of Pi Zero distributors, visit the Pi Zero W page.

Awesome feature image GIF credit goes to Justin Mezzell

The post 250,000 Pi Zero W units shipped and more Pi Zero distributors announced appeared first on Raspberry Pi.

Querying OpenStreetMap with Amazon Athena

Post Syndicated from Seth Fitzsimmons original https://aws.amazon.com/blogs/big-data/querying-openstreetmap-with-amazon-athena/

This is a guest post by Seth Fitzsimmons, member of the 2017 OpenStreetMap US board of directors. Seth works with clients including the Humanitarian OpenStreetMap Team, Mapzen, the American Red Cross, and World Bank to craft innovative geospatial solutions.

OpenStreetMap (OSM) is a free, editable map of the world, created and maintained by volunteers and available for use under an open license. Companies and non-profits like Mapbox, Foursquare, Mapzen, the World Bank, the American Red Cross and others use OSM to provide maps, directions, and geographic context to users around the world.

In the 12 years of OSM’s existence, editors have created and modified several billion features (physical things on the ground like roads or buildings). The main PostgreSQL database that powers the OSM editing interface is now over 2TB and includes historical data going back to 2007. As new users join the open mapping community, more and more valuable data is being added to OpenStreetMap, requiring increasingly powerful tools, interfaces, and approaches to explore its vastness.

This post explains how anyone can use Amazon Athena to quickly query publicly available OSM data stored in Amazon S3 (updated weekly) as an AWS Public Dataset. Imagine that you work for an NGO interested in improving knowledge of and access to health centers in Africa. You might want to know what’s already been mapped, to facilitate the production of maps of surrounding villages, and to determine where infrastructure investments are likely to be most effective.

Note: If you run all the queries in this post, you will be charged approximately $1 based on the number of bytes scanned. All queries used in this post can be found in this GitHub gist.

What is OpenStreetMap?

As an open content project, regular OSM data archives are made available to the public via planet.openstreetmap.org in a few different formats (XML, PBF). This includes both snapshots of the current state of data in OSM as well as historical archives.

Working with “the planet” (as the data archives are referred to) can be unwieldy. Because it contains data spanning the entire world, the size of a single archive is on the order of 50 GB. The format is bespoke and extremely specific to OSM. The data is incredibly rich, interesting, and useful, but the size, format, and tooling can often make it very difficult to even start the process of asking complex questions.

Heavy users of OSM data typically download the raw data and import it into their own systems, tailored for their individual use cases, such as map rendering, driving directions, or general analysis. Now that OSM data is available in the Apache ORC format on Amazon S3, it’s possible to query the data using Athena without even downloading it.

How does Athena help?

You can use Athena along with data made publicly available via OSM on AWS. You don’t have to learn how to install, configure, and populate your own server instances and go through multiple steps to download and transform the data into a queryable form. Thanks to AWS and partners, a regularly updated copy of the planet file (available within hours of OSM’s weekly publishing schedule) is hosted on S3 and made available in a format that lends itself to efficient querying using Athena.

Asking questions with Athena involves registering the OSM planet file as a table and making SQL queries. That’s it. Nothing to download, nothing to configure, nothing to ingest. Athena distributes your queries and returns answers within seconds, even while querying over 9 years and billions of OSM elements.

You’re in control. S3 provides high availability for the data and Athena charges you per TB of data scanned. Plus, we’ve gone through the trouble of keeping scanning charges as small as possible by transcoding OSM’s bespoke format as ORC. All the hard work of transforming the data into something highly queryable and making it publicly available is done; you just need to bring some questions.

Registering Tables

The OSM Public Datasets consist of three tables:

  • planet
    Contains the current versions of all elements present in OSM.
  • planet_history
    Contains a historical record of all versions of all elements (even those that have been deleted).
  • changesets
    Contains information about changesets in which elements were modified (and which have a foreign key relationship to both the planet and planet_history tables).

To register the OSM Public Datasets within your AWS account so you can query them, open the Athena console (make sure you are using the us-east-1 region) to paste and execute the following table definitions:

planet

CREATE EXTERNAL TABLE planet (
  id BIGINT,
  type STRING,
  tags MAP<STRING,STRING>,
  lat DECIMAL(9,7),
  lon DECIMAL(10,7),
  nds ARRAY<STRUCT<ref: BIGINT>>,
  members ARRAY<STRUCT<type: STRING, ref: BIGINT, role: STRING>>,
  changeset BIGINT,
  timestamp TIMESTAMP,
  uid BIGINT,
  user STRING,
  version BIGINT
)
STORED AS ORCFILE
LOCATION 's3://osm-pds/planet/';

planet_history

CREATE EXTERNAL TABLE planet_history (
    id BIGINT,
    type STRING,
    tags MAP<STRING,STRING>,
    lat DECIMAL(9,7),
    lon DECIMAL(10,7),
    nds ARRAY<STRUCT<ref: BIGINT>>,
    members ARRAY<STRUCT<type: STRING, ref: BIGINT, role: STRING>>,
    changeset BIGINT,
    timestamp TIMESTAMP,
    uid BIGINT,
    user STRING,
    version BIGINT,
    visible BOOLEAN
)
STORED AS ORCFILE
LOCATION 's3://osm-pds/planet-history/';

changesets

CREATE EXTERNAL TABLE changesets (
    id BIGINT,
    tags MAP<STRING,STRING>,
    created_at TIMESTAMP,
    open BOOLEAN,
    closed_at TIMESTAMP,
    comments_count BIGINT,
    min_lat DECIMAL(9,7),
    max_lat DECIMAL(9,7),
    min_lon DECIMAL(10,7),
    max_lon DECIMAL(10,7),
    num_changes BIGINT,
    uid BIGINT,
    user STRING
)
STORED AS ORCFILE
LOCATION 's3://osm-pds/changesets/';
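
If you prefer to script your analysis rather than use the console, the same tables can be queried through the AWS SDK. Here is a minimal Python sketch using boto3 (this is not part of the original post, and the results bucket is a placeholder you would replace with your own):

import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Athena writes query results to S3; this bucket name is hypothetical.
OUTPUT_LOCATION = "s3://your-athena-results-bucket/osm/"

def run_query(sql, database="default"):
    """Submit a query, then poll until Athena finishes executing it."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return qid, state
        time.sleep(2)

qid, state = run_query("SELECT count(*) FROM planet WHERE type = 'node'")
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(rows[1]["Data"][0]["VarCharValue"])  # rows[0] is the header row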


Under the Hood: Extract, Transform, Load

So, what happens behind the scenes to make this easier for you? In a nutshell, the data is transcoded from the OSM PBF format into Apache ORC.

There’s an AWS Lambda function (running every 15 minutes, triggered by CloudWatch Events) that watches planet.openstreetmap.org for the presence of weekly updates (using rsync). If that function detects that a new version has become available, it submits a set of AWS Batch jobs to mirror, transcode, and place it as the “latest” version. Code for this is available at osm-pds-pipelines GitHub repo.

To facilitate the transcoding into a format appropriate for Athena, we have produced an open source tool, OSM2ORC. The tool also includes an Osmosis plugin that can be used with complex filtering pipelines and outputs an ORC file that can be uploaded to S3 for use with Athena, or used locally with other tools from the Hadoop ecosystem.

What types of questions can OpenStreetMap answer?

There are many uses for OpenStreetMap data; here are three major ones and how they may be addressed using Athena.

Case Study 1: Finding Local Health Centers in West Africa

When the American Red Cross mapped more than 7,000 communities in West Africa in areas affected by the Ebola epidemic as part of the Missing Maps effort, they found themselves in a position where collecting a wide variety of data was both important and incredibly beneficial for others. Accurate maps play a critical role in understanding human communities, especially for populations at risk. The lack of detailed maps for West Africa posed a problem during the 2014 Ebola crisis, so collecting and producing data around the world has the potential to improve disaster responses in the future.

As part of the data collection, volunteers collected locations and information about local health centers, something that will facilitate treatment in future crises (and, more importantly, on a day-to-day basis). Combined with information about access to markets and clean drinking water and historical experiences with natural disasters, this data was used to create a vulnerability index to select communities for detailed mapping.

For this example, you find all health centers in West Africa (many of which were mapped as part of Missing Maps efforts). This is something that healthsites.io does for the public (worldwide and editable, based on OSM data), but you’re working with the raw data.

Here’s a query to fetch information about all health centers, tagged as nodes (points), in Guinea, Sierra Leone, and Liberia:

SELECT * from planet
WHERE type = 'node'
  AND tags['amenity'] IN ('hospital', 'clinic', 'doctors')
  AND lon BETWEEN -15.0863 AND -7.3651
  AND lat BETWEEN 4.3531 AND 12.6762;

Buildings, as “ways” (polygons, in this case) assembled from constituent nodes (points), can also be tagged as medical facilities. In order to find those, you need to reassemble geometries. Here you take the average of all nodes that make up a building, which gives its approximate center point (close enough for this purpose). Here is a query that finds both buildings and points that are tagged as medical facilities:

-- select out nodes and relevant columns
WITH nodes AS (
  SELECT
    type,
    id,
    tags,
    lat,
    lon
  FROM planet
  WHERE type = 'node'
),
-- select out ways and relevant columns
ways AS (
  SELECT
    type,
    id,
    tags,
    nds
  FROM planet
  WHERE type = 'way'
    AND tags['amenity'] IN ('hospital', 'clinic', 'doctors')
),
-- filter nodes to only contain those present within a bounding box
nodes_in_bbox AS (
  SELECT *
  FROM nodes
  WHERE lon BETWEEN -15.0863 AND -7.3651
    AND lat BETWEEN 4.3531 AND 12.6762
)
-- find ways intersecting the bounding box
SELECT
  ways.type,
  ways.id,
  ways.tags,
  AVG(nodes.lat) lat,
  AVG(nodes.lon) lon
FROM ways
CROSS JOIN UNNEST(nds) AS t (nd)
JOIN nodes_in_bbox nodes ON nodes.id = nd.ref
GROUP BY (ways.type, ways.id, ways.tags)
UNION ALL
SELECT
  type,
  id,
  tags,
  lat,
  lon
FROM nodes_in_bbox
WHERE tags['amenity'] IN ('hospital', 'clinic', 'doctors');

You could go a step further and query for additional tags included with these (for example, opening_hours) and use that as a metric for measuring “completeness” of the dataset and to focus on additional data to collect (and locations to fill out).

Case Study 2: Generating statistics about mapathons

OSM has a history of holding mapping parties. Mapping parties are events where interested people get together and wander around outside, gathering and improving information about sites (and sights) that they pass. Another form of mapping party is the mapathon, which brings together armchair and desk mappers to focus on improving data in another part of the world.

Mapathons are a popular way of enlisting volunteers for Missing Maps, a collaboration between many NGOs, education institutions, and civil society groups that aims to map the most vulnerable places in the developing world to support international and local NGOs and individuals. One common way that volunteers participate is to trace buildings and roads from aerial imagery, providing baseline data that can later be verified by Missing Maps staff and volunteers working in the areas being mapped.

(Image and data from the American Red Cross)

Data collected during these events lends itself to a couple different types of questions. People like competition, so Missing Maps has developed a set of leaderboards that allow people to see where they stand relative to other mappers and how different groups compare. To facilitate this, hashtags (such as #missingmaps) are included in OSM changeset comments. To do similar ad hoc analysis, you need to query the list of changesets, filter by the presence of certain hashtags in the comments, and group things by username.

Now, find changes made during Missing Maps mapathons at George Mason University (using the #gmu hashtag):

SELECT *
FROM changesets
WHERE regexp_like(tags['comment'], '(?i)#gmu');

This includes all tags associated with a changeset, which typically include a mapper-provided comment about the changes made (often with additional hashtags corresponding to OSM Tasking Manager projects) as well as information about the editor used, imagery referenced, etc.

If you’re interested in the number of individual users who have mapped as part of the Missing Maps project, you can write a query such as:

SELECT COUNT(DISTINCT uid)
FROM changesets
WHERE regexp_like(tags['comment'], '(?i)#missingmaps');

25,610 people (as of this writing)!

Back at GMU, you’d like to know who the most prolific mappers are:

SELECT user, count(*) AS edits
FROM changesets
WHERE regexp_like(tags['comment'], '(?i)#gmu')
GROUP BY user
ORDER BY count(*) DESC;

Nice job, BrokenString!

It’s also interesting to see what types of features were added or changed. You can do that by using a JOIN between the changesets and planet tables:

SELECT planet.*, changesets.tags
FROM planet
JOIN changesets ON planet.changeset = changesets.id
WHERE regexp_like(changesets.tags['comment'], '(?i)#gmu');


Using this as a starting point, you could break down the types of features, highlight popular parts of the world, or do something entirely different.

Case Study 3: Building Condition

With building outlines having been produced by mappers around (and across) the world, local Missing Maps volunteers (often from local Red Cross / Red Crescent societies) go around with Android phones running OpenDataKit and OpenMapKit to verify that the buildings in question actually exist and to add additional information about them, such as the number of stories, use (residential, commercial, etc.), material, and condition.

This data can be used in many ways: it can provide local geographic context (by being included in map source data) as well as facilitate investment by development agencies such as the World Bank.

Here are a collection of buildings mapped in Dhaka, Bangladesh:

(Map and data © OpenStreetMap contributors)

For NGO staff to determine resource allocation, it can be helpful to enumerate and show buildings in varying conditions. Building conditions within an area can be a means of understanding where to focus future investments.

Querying for buildings is a bit more complicated than working with points or changesets. Of the three core OSM element types (node, way, and relation), only nodes (points) have geographic information associated with them. Ways (lines or polygons) are composed of nodes and inherit vertices from them. This means that ways must be reconstituted in order to effectively query by bounding box.

This results in a fairly complex query. You’ll notice that this is similar to the query used to find buildings tagged as medical facilities above. Here you’re counting buildings in Dhaka according to building condition:

-- select out nodes and relevant columns
WITH nodes AS (
  SELECT
    id,
    tags,
    lat,
    lon
  FROM planet
  WHERE type = 'node'
),
-- select out ways and relevant columns
ways AS (
  SELECT
    id,
    tags,
    nds
  FROM planet
  WHERE type = 'way'
),
-- filter nodes to only contain those present within a bounding box
nodes_in_bbox AS (
  SELECT *
  FROM nodes
  WHERE lon BETWEEN 90.3907 AND 90.4235
    AND lat BETWEEN 23.6948 AND 23.7248
),
-- fetch and expand referenced ways
referenced_ways AS (
  SELECT
    ways.*,
    t.*
  FROM ways
  CROSS JOIN UNNEST(nds) WITH ORDINALITY AS t (nd, idx)
  JOIN nodes_in_bbox nodes ON nodes.id = nd.ref
),
-- fetch *all* referenced nodes (even those outside the queried bounding box)
exploded_ways AS (
  SELECT
    ways.id,
    ways.tags,
    idx,
    nd.ref,
    nodes.id node_id,
    ARRAY[nodes.lat, nodes.lon] coordinates
  FROM referenced_ways ways
  JOIN nodes ON nodes.id = nd.ref
  ORDER BY ways.id, idx
)
-- query ways matching the bounding box
SELECT
  count(*),
  tags['building:condition']
FROM exploded_ways
GROUP BY tags['building:condition']
ORDER BY count(*) DESC;


Most buildings are unsurveyed (125,000 is a lot!), but of those that have been, most are average (as you’d expect). If you were to further group these buildings geographically, you’d have a starting point to determine which areas of Dhaka might benefit the most.

Conclusion

OSM data, while incredibly rich and valuable, can be difficult to work with, due to both its size and its data model. In addition to the time spent downloading large files to work with locally, time is spent installing and configuring tools and converting the data into more queryable formats. We think Amazon Athena combined with the ORC version of the planet file, updated on a weekly basis, is an extremely powerful and cost-effective combination. This allows anyone to start querying billions of records with simple SQL in no time, giving you the chance to focus on the analysis, not the infrastructure.

To download the data and experiment with it using other tools, the latest OSM ORC-formatted files are available via OSM on AWS at s3://osm-pds/planet/planet-latest.orc, s3://osm-pds/planet-history/history-latest.orc, and s3://osm-pds/changesets/changesets-latest.orc.

We look forward to hearing what you find out!

A Serverless Authentication System by Jumia

Post Syndicated from Bryan Liston original https://aws.amazon.com/blogs/compute/a-serverless-authentication-system-by-jumia/

Jumia Group is an ecosystem of nine different companies operating in 22 different countries in Africa. Jumia employs 3000 people and serves 15 million users/month.

Want to secure and centralize millions of user accounts across Africa? Shut down your servers! Jumia Group unified and centralized customer authentication on nine digital services platforms, operating in 22 (and counting) countries in Africa, totaling over 120 customer- and merchant-facing applications. All were unified into a custom Jumia Central Authentication System (JCAS), built in a timely fashion and designed using a serverless architecture.

In this post, we give you our solution overview. For the full technical implementation, see the Jumia Central Authentication System post on the Jumia Tech blog.

The challenge

A group-level initiative was started to centralize authentication for all Jumia users in all countries for all companies. But it was impossible to unify the operational databases for the different companies. Each company had its own user database with its own technological implementation. Each company alone had yet to unify the logins even for its own countries. The effects of deduplicating all user accounts were yet to be determined but were expected to be large. Finally, there was no team responsible for managing this new project, given that a new dedicated infrastructure would be needed.

With these factors in mind, we decided to design this project as a serverless architecture to eliminate the need for infrastructure and a dedicated team. AWS was immediately considered as the best option for its level of service, intelligent pricing model, and excellent serverless services.

The goal was simple. For all user accounts on all Jumia Group websites:

  • Merge duplicates that might exist in different companies and countries
  • Create a unique login (username and password)
  • Enrich the user profile in this centralized system
  • Share the profile with all Jumia companies

Requirements

We had the following initial requirements while designing this solution on the AWS platform:

  • Secure by design
  • Highly available via multimaster replication
  • Single login for all platforms/countries
  • Minimal performance impact
  • No admin overhead

Chosen technologies

We chose the following AWS services and technologies to implement our solution.

Amazon API Gateway

Amazon API Gateway is a fully managed service, making it really simple to set up an API. It integrates directly with AWS Lambda, which was chosen as our endpoint. It can be easily replicated to other regions, using Swagger import/export.

AWS Lambda

AWS Lambda is the base of serverless computing, allowing you to run code without worrying about infrastructure. All our code runs on Lambda functions using Python; some functions are called from the API Gateway, others are scheduled like cron jobs.

Amazon DynamoDB

Amazon DynamoDB is a highly scalable NoSQL database with a good API and a clean pricing model. It offers high availability out of the box and fits the serverless model we aimed for with JCAS.

AWS KMS

AWS KMS was chosen as a key manager to perform envelope encryption. It’s a simple and secure key manager with multiple encryption functionalities.

Envelope encryption

Oversimplifying: envelope encryption means encrypting a key rather than the data itself. The data is encrypted with a data key, and that data key is then itself encrypted; the encrypted key may be stored with the data on your persistence layer, since on its own it cannot decrypt the data if compromised. For more information, see How Envelope Encryption Works with Supported AWS Services.

Envelope encryption was chosen given that master keys have a 4 KB limit for data to be encrypted or decrypted.
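
To make the pattern concrete, here is a minimal Python sketch of envelope encryption with KMS (not Jumia’s actual code; the key alias is hypothetical, and the cryptography library handles the AES-CBC step):

import os

import boto3
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

kms = boto3.client("kms")

def encrypt_blob(plaintext, key_id="alias/jcas-master"):  # hypothetical alias
    # Ask KMS for a fresh data key: we receive it both in plaintext (used
    # to encrypt, then discarded) and encrypted under the master key (stored).
    data_key = kms.generate_data_key(KeyId=key_id, KeySpec="AES_256")

    # CBC requires an IV and input padded to the 16-byte block size (PKCS#7).
    iv = os.urandom(16)
    pad = 16 - len(plaintext) % 16
    padded = plaintext + bytes([pad]) * pad

    encryptor = Cipher(algorithms.AES(data_key["Plaintext"]), modes.CBC(iv),
                       backend=default_backend()).encryptor()
    ciphertext = encryptor.update(padded) + encryptor.finalize()

    # Persist the ciphertext, the IV, and the encrypted data key together;
    # without access to KMS, the stored key cannot decrypt anything.
    return {"ciphertext": ciphertext, "iv": iv, "key": data_key["CiphertextBlob"]}

On the read path, handing the stored key back to kms.decrypt yields the plaintext data key again, and the same AES-CBC operation runs in reverse.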

Amazon SQS

Amazon SQS is an inexpensive queuing service with dynamic scaling, 14-day retention availability, and easy management. It’s the perfect choice for our needs, as we use queuing systems only as a fallback when saving data to remote regions fails. All the features needed for those fallback cases were covered.

JWT

We also use JSON web tokens for encoding and signing communications between JCAS and company servers. It’s another layer of security for data in transit.
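
As a rough illustration of that layer (the secret and claims below are made up, and PyJWT is just one library choice):

import jwt  # PyJWT

SHARED_SECRET = "per-company-secret"  # hypothetical; exchanged out of band

# The calling platform signs the payload before POSTing it to JCAS...
token = jwt.encode({"email": "user@example.com", "action": "login"},
                   SHARED_SECRET, algorithm="HS256")

# ...and JCAS verifies the signature, rejecting anything altered in transit.
claims = jwt.decode(token, SHARED_SECRET, algorithms=["HS256"])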

Data structure

Our database design was pretty straightforward. DynamoDB records are accessed by their primary key, with secondary indexes allowing additional lookups. We created a UUID for each user on JCAS, which is used as the primary key on DynamoDB.

Most data must be encrypted, so we use a single field for that data. On the other hand, some data needs to be stored in separate fields, as the code must access it without decryption for lookups or basic checks. This indexed data is stored, either hashed or in plain text, outside the main ciphered blob.

We used a field for each searchable data piece:

  • Id (primary key)
  • main phone (indexed)
  • main email (indexed)
  • account status
  • document timestamp

To hold the encrypted data we use a data dictionary with two main dictionaries: info, with the user’s encrypted data, and keys, with the key for each AWS KMS region to decrypt the info blob. Passwords are stored in two dictionaries: old_hashes contains the legacy hashes from the origin systems, and secret holds the user’s JCAS password.

Here’s an example of the data at rest.
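
The original post showed the record as an image; a hypothetical record along the same lines (every value below is invented) might look like this in Python:

record = {
    "id": "1c9b2f6e-aaaa-bbbb-cccc-000000000000",  # UUID primary key (made up)
    "phone": "<hash of the main phone>",           # indexed
    "email": "<hash of the main email>",           # indexed
    "status": "active",                            # account status
    "timestamp": 1494028800,                       # document timestamp
    "data": {
        "info": "<AES256-CBC ciphertext of the user profile>",
        "keys": {
            "eu-west-1": "<data key encrypted by that region's KMS master key>",
            "us-east-1": "<data key encrypted by that region's KMS master key>",
        },
    },
    "old_hashes": {"origin-system": "<legacy hash imported at migration>"},
    "secret": "<hash of the user's JCAS password>",
}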

Security

Security is like an onion: it needs to be layered. That’s what we did when designing this solution. Our design makes all of our data unreadable at each layer, while easing our compliance needs.

A field called data stores all personal information from customers. It’s encrypted using AES256-CBC with a key generated and managed by AWS KMS; a new data key is used for each transaction. For communication between companies and the API, we use API keys, TLS, and a JWT in the request body, to ensure that each post is signed and verified.

Data flow

Our second requirement for JCAS was system availability, so we designed some data pipelines. These pipelines allow multi-region replication, avoiding collisions by using idempotent operations on all endpoints. The only technology added to the stack was Amazon SQS. On SQS queues, we place all the items we aren’t able to replicate at the time of the client’s request.

JCAS may have inconsistencies between regions, caused by network or region availability; by using SQS, we have workers that synchronize all regions as soon as possible.

Example with two regions

We have three Lambda functions and two SQS queues, where:

1) A trigger Lambda function is called for the DynamoDB stream. Upon changes to the user’s table, it tries to write directly to another DynamoDB table in the second region, falling back to writing to two SQS queues.


2) A scheduled (cron-style) Lambda function checks an SQS queue for items and retries writing them to the DynamoDB table where the write potentially failed.


3) A second cron-style Lambda function checks the other SQS queue and, for any items found, calls KMS and fixes the issue.

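As a sketch of step 1 (the table, queue, and region names here are hypothetical, and the two fallback queues are simplified to one), the stream-triggered function could look like this:

import json

import boto3
from botocore.exceptions import ClientError

REMOTE_REGION = "us-east-1"   # hypothetical
TABLE_NAME = "jcas-users"     # hypothetical
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/jcas-replication"  # hypothetical

remote_table = boto3.resource("dynamodb", region_name=REMOTE_REGION).Table(TABLE_NAME)
sqs = boto3.client("sqs")

def unmarshal(image):
    """Collapse the stream's typed format ({'S': ...}, {'N': ...}) into
    plain values. Simplified: handles strings and integers only."""
    out = {}
    for key, typed in image.items():
        (dtype, value), = typed.items()
        out[key] = int(value) if dtype == "N" else value
    return out

def handler(event, context):
    # Triggered by the DynamoDB stream on the local users table.
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        image = record["dynamodb"]["NewImage"]
        try:
            # Replication is idempotent: replaying the same item is harmless.
            remote_table.put_item(Item=unmarshal(image))
        except ClientError:
            # Remote region unreachable: park the item on the queue so the
            # scheduled worker (step 2) can retry it later.
            sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(image))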

The following diagram shows the full infrastructure (for clarity, this diagram leaves out KMS recovery).

(Diagram: the full CAS infrastructure)

Results

Upon going live, we noticed a minor impact on our response times (the brown series in the latency graphs that accompanied the original post).

(Latency graph 1)

This was a cold start; as the infrastructure warmed up, response times started to converge. By 27 April, we were almost back at the initial 500ms.

(Latency graph 2)

They then held steady at the values from before JCAS went live (≈500ms).

(Latency graph 3)

As of the writing of this post, our response time keeps improving (dev changed the method name and we changed the subdomain name).

(Latency graph 4)

Customer login used to take ≈500ms, and it still takes ≈500ms with JCAS. Those times have improved as other components changed inside our code.

Lessons learned

  • Instead of the standard cross-region replication, creating your own DynamoDB cross-region replication might be a better fit for you, if your data and applications allow it.
  • Take some time to tweak the Lambda runtime memory. Remember that it’s billed per 100ms, so it saves you money if you have it run near a round number.
  • KMS takes away the problem of key management with great security by design. It really simplifies your life.
  • Always check the timestamp before operating on data. If it’s invalid, save the money by skipping further KMS, DynamoDB, and Lambda calls; see the sketch after this list. You’ll love your systems even more.
  • Embrace the serverless paradigm, even if it looks complicated at first. It will save your life further down the road when your traffic bursts or you want to find an engineer who knows your whole system.
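
To illustrate the timestamp check from the list above (a tiny hedged sketch; the field names are assumptions):

def is_newer(incoming, stored):
    """Skip the update, and the KMS, DynamoDB, and Lambda work it would
    trigger, when the incoming document is not newer than the stored one."""
    return incoming.get("timestamp", 0) > stored.get("timestamp", 0)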

Next steps

We are going to leverage everything we’ve done so far to implement SSO in Jumia. For a future project, we are already testing OpenID Connect with DynamoDB as a backend.

Conclusion

This was a mindset revolution in many ways. Not only did we go completely serverless, we also started storing critical information in the cloud. On top of this, we architected a system where all user data is decoupled between local systems and our central auth silo. Managing all of these critical systems became far more predictable and less cumbersome than we thought possible. For us, this is proof that good, simple designs are the best features to look for when sketching new systems.

If we were to do this again, we would do it in exactly the same way.

Daniel Loureiro (SecOps) & Tiago Caxias (DBA) – Jumia Group