Tag Archives: toy

When tiny robot COZMO met our tiny Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/cozmo-raspberry-pi/

Hack your COZMO for ultimate control, using a Raspberry Pi and this tutorial from Instructables user Marcelo ‘mjrovai’ Rovai.

Cozmo – RPi 4

Full integration. The complete tutorial can be found here: https://www.instructables.com/id/When-COZMO-the-Robot-Meets-the-Raspberry-Pi/

COZMO

COZMO is a Python-programmable robot from ANKI that boasts a variety of on-board sensors and a camera, and that can be controlled via an app or via code. To get an idea of how COZMO works, check out this rather excitable video from the wonderful Mayim Bialik.

The COZMO SDK

COZMO’s creators, ANKI, provide a Software Development Kit (SDK) so that users can get the most out of their COZMO. This added functionality is a great opportunity for budding coders to dive into hacking their toys, without the risk of warranty voiding/upsetting parents/not being sure how to put a toy back together again.

By the way, I should point out that this is in no way a sponsored blog post. I just think COZMO is ridiculously cute…because tiny robots are adorable, no matter their intentions.

Raspberry Pi Doctor Who Cybermat

Marcelo Rovai + Raspberry Pi + COZMO

For his Instructables tutorial, Marcelo connected an Android device running the COZMO app to his Raspberry Pi 3 via USB. Once USB debugging had been enabled on the device, he installed the Android Debug Bridge (ADB) on the Raspberry Pi. The Pi was then able to recognise the connected Android device, and from there Marcelo moved on to installing the SDK, including support for COZMO’s camera.

COZMO Raspberry Pi

The SDK comes with pre-installed examples, allowing users to try out the possibilities of the kit, such as controlling what COZMO says by editing a Python script.
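
As a rough illustration of what that looks like, here is a minimal “hello world” in the style of the SDK’s published examples (the exact sample shipped with the SDK may differ slightly):

    import cozmo

    # Minimal sketch in the style of the Cozmo SDK examples: connect to the
    # robot and have it speak a phrase. Change the string to change what
    # COZMO says.
    def cozmo_program(robot: cozmo.robot.Robot):
        robot.say_text("Hello from the Raspberry Pi").wait_for_completed()

    cozmo.run_program(cozmo_program)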

Cozmo and RPi

Hello World. The complete tutorial can be found here: https://www.instructables.com/id/When-COZMO-the-Robot-Meets-the-Raspberry-Pi/

Do more with COZMO

Marcelo’s tutorial offers more example code for users of the COZMO SDK, along with the code to run the LED button game featured in the video above, and tips on utilising the SDK to take full advantage of COZMO. Check it out here on Instructables, and visit his website for even more projects.

The post When tiny robot COZMO met our tiny Raspberry Pi appeared first on Raspberry Pi.

Eevee gained 2791 experience points

Post Syndicated from Eevee original https://eev.ee/blog/2018/01/15/eevee-gained-2791-experience-points/

Eevee grew to level 31!

A year strongly defined by mixed success! Also, a lot of video games.

I ran three game jams, resulting in a total of 157 games existing that may not have otherwise, which is totally mindblowing?!

For GAMES MADE QUICK???, glip and I made NEON PHASE, a short little exploratory platformer. Honestly, I should give myself more credit for this and the rest of the LÖVE games I’ve based on the same codebase — I wove a physics engine (and everything else!) from scratch and it has held up remarkably well for a variety of different uses.

I successfully finished an HD version of Isaac’s Descent using my LÖVE engine, though it doesn’t have anything new over the original and I’ve only released it as a tech demo on Patreon.

For Strawberry Jam (NSFW!) we made fox flux (slightly NSFW!), which felt like a huge milestone: the first game where I made all the art! I mean, not counting Isaac’s Descent, which was for a very limited platform. It’s a pretty arbitrary milestone, yes, but it feels significant. I’ve been working on expanding the game into a longer and slightly less buggy experience, but the art is taking the longest by far. I must’ve spent weeks on player sprites alone.

We then set about working on Bolthaven, a sequel of sorts to NEON PHASE, and got decently far, and then abandoned it. Oops.

We then started a cute little PICO-8 game, and forgot about it. Oops.

I was recruited to help with Chaos Composer, a more ambitious game glip started with someone else in Unity. I had to get used to Unity, and we squabbled a bit, but the game is finally about at the point where it’s “playable” and “maps” can be designed? It’s slightly on hold at the moment while we all finish up some other stuff, though.

We made a birthday game for two of our friends whose birthdays were very close together! Only they got to see it.

For Ludum Dare 38, we made Lunar Depot 38, a little “wave shooter” or whatever you call those? The AI is pretty rough, seeing as this was the first time I’d really made enemies and I had 72 hours to figure out how to do it, but I still think it’s pretty fun to play and I love the circular world.

I made Roguelike Simulator as an experiment with making something small and quick with a simple tool, and I had a lot of fun! I definitely want to do more stuff like this in the future.

And now we’re working on a game about Star Anise, my cat’s self-insert, which is looking to have more polish and depth than anything we’ve done so far! We’ve definitely come a long way in a year.

Somewhere along the line, I put out a call for a “potluck” project, where everyone would give me sprites of a given size without knowing what anyone else had contributed, and I would then make a game using only those sprites. Unfortunately, that stalled a few times: I tried using the Phaser JS library, but we didn’t get along; I tried LÖVE, but didn’t know where to go with the game; and then I decided to use this as an experiment with procedural generation, and didn’t get around to it. I still feel bad that everyone did work for me and I didn’t follow through, but I don’t know whether this will ever become a game.

veekun, alas, consumed months of my life. I finally got Sun and Moon loaded, but it took weeks of work since I was basically reinventing all the tooling we’d ever had from scratch, without even having most of that tooling available as a reference. It was worth it in the end, at least: Ultra Sun and Ultra Moon only took a few days to get loaded. But veekun itself is still missing some obvious Sun/Moon features, and the whole site needs an overhaul, and I just don’t know if I want to dedicate that much time to it when I have so much other stuff going on that’s much more interesting to me right now.

I finally turned my blog into more of a website, giving it a neat front page that lists a bunch of stuff I’ve done. I made a release category at last, though I’m still not quite in the habit of using it.

I wrote some blog posts, of course! I think the most interesting were JavaScript got better while I wasn’t looking and Object models. I was also asked to write a couple pieces for money for a column that then promptly shut down.

On a whim, I made a set of Eevee mugshots for Doom, which I think is a decent indication of my (pixel) art progress over the year?

I started idchoppers, a Doom parsing and manipulation library written in Rust, though it didn’t get very far and I’ve spent most of the time fighting with Rust because it won’t let me implement all my extremely bad ideas. It can do a couple things, at least, like flip maps very quickly and render maps to SVG.

I did toy around with music a little, but not a lot.

I wrote two short twines for Flora. They’re okay. I’m working on another; I think it’ll be better.

I didn’t do a lot of art overall, at least compared to the two previous years; most of my art effort over the year has gone into fox flux, which requires me to learn a whole lot of things. I did dip my toes into 3D modelling, most notably producing my current Twitter banner as well as this cool Star Anise animation. I wouldn’t mind doing more of that; maybe I’ll even try to make a low-poly pixel-textured 3D game sometime.

I restarted my book with a much better concept, though so far I’ve only written about half a chapter. Argh. I see that the vast majority of the work was done within the span of a single week, which is bad since that means I only worked on it for a week, but good since that means I can actually do a pretty good amount of work in only a week. I also did a lot of squabbling with tooling, which is hopefully mostly out of the way now.

My computer broke? That was an exciting week.


A lot of stuff, but the year as a whole still feels hit or miss. All the time I spent on veekun feels like a black void in the middle of the year, which seems like a good sign that I maybe don’t want to pour even more weeks into it in the near future.

Mostly, I want to do: more games, more art, more writing, more music.

I want to try out some tiny game making tools and make some tiny games with them — partly to get exposure to different things, partly to get more little ideas out into the world regularly, and partly to get more practice at letting myself have ideas. I have a couple tools in mind and I guess I’ll aim at a microgame every two months or so? I’d also like to finish the expanded fox flux by the end of the year, of course, though at the moment I can’t even gauge how long it might take.

I seriously lapsed on drawing last year, largely because fox flux pixel art took me so much time. So I want to draw more, and I want to get much faster at pixel art. It would probably help if I had a more concrete goal for drawing, so I might try to draw some short comics and write a little visual novel or something, which would also force me to aim for consistency.

I want to work on my book more, of course, but I also want to try my hand at a bit more fiction. I’ve had a blast writing dialogue for our games! I just shy away from longer-form writing for some reason — which seems ridiculous when a large part of my audience found me through my blog. I do think I’ve had some sort of breakthrough in the last month or two; I suddenly feel a good bit more confident about writing in general and figuring out what I want to say? One recent post I know I wrote in a single afternoon, which virtually never happens because I keep rewriting and rearranging stuff. Again, a visual novel would be a good excuse to practice writing fiction without getting too bogged down in details.

And, ah, music. I shy heavily away from music, since I have no idea what I’m doing, and also I seem to spend a lot of time fighting with tools. (Surprise.) I tried out SunVox for the first time just a few days ago and have been enjoying it quite a bit for making sound effects, so I might try it for music as well. And once again, visual novel background music is a pretty low-pressure thing to compose for. Hell, visual novels are small games, too, so that checks all the boxes. I guess I’ll go make a visual novel.

Here’s to twenty gayteen!

Marvellous retrofitted home assistants

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/retrofitted-home-assistants/

As more and more digital home assistants are appearing on the consumer market, it’s not uncommon to see the towering Amazon Echo or sleek Google Home when visiting friends or family. But we, the maker community, are rarely happy unless our tech stands out from the rest. So without further ado, here’s a roundup of some fantastic retrofitted home assistant projects you can recreate and give pride of place in your kitchen, on your bookshelf, or wherever else you’d like to talk to your virtual, disembodied PA.

Google AIY Robot Conversion

Turned an 80s Tomy Mr Money into a little Google AIY / Raspberry Pi based assistant.

Matt ‘Circuitbeard’ Brailsford’s Tomy Mr Money Google AIY Assistant is just one of many home-brew home assistants makers have built since the release of APIs for Amazon Alexa and Google Home. Here are some more…

Teddy Ruxpin

Oh Teddy, how exciting and mysterious you were when I unwrapped you back in the mid-eighties. With your awkwardly moving lips and twitching eyelids, you were the cream of the crop of robotic toys! How was I to know that during my thirties, you would become augmented with home assistant software and suddenly instil within me a fear unlike any I’d felt before? (Save for my lifelong horror of ET…)

Alexa Ruxpin – Raspberry Pi & Alexa Powered Teddy Bear

There are tons of virtual assistants out on the market: Siri, Ok Google, Alexa, etc. I had this crazy idea…what if I made the virtual assistant real…kinda. I decided to take an old animatronic teddy bear and hack it so that it ran Amazon Alexa.

Several makers around the world have performed surgery on Teddy to install a Raspberry Pi within his stomach and integrate him with Amazon Alexa Voice or Google’s AIY Projects Voice kit. And because these makers are talented, they’ve also managed to hijack Teddy’s wiring to make his lips move in time with his responses to your commands. Freaky…

Speaking of freaky: check out Zack’s Furlexa — an Amazon Alexa Furby that will haunt your nightmares.

Give old tech new life

Devices that were the height of technology when you purchased them may now be languishing in your attic collecting dust. With new and improved versions of gadgets and gizmos being released almost constantly, it is likely that your household harbours a spare whosit or whatsit which you can dismantle and give a new Raspberry Pi heart and purpose.

Take, for example, Martin Mander’s Google Pi intercom. By gutting and thoroughly cleaning a vintage intercom, Martin fashioned a suitable housing for the Google AIY Projects Voice kit to create a new home assistant for his house:

1986 Google Pi Intercom

This is a 1986 Radio Shack Intercom that I’ve converted into a Google Home style device using a Raspberry Pi and the Google AIY (Artificial Intelligence Yourself) kit that came free with the MagPi magazine (issue 57). It uses the Google Assistant to answer questions and perform actions, using IFTTT to integrate with smart home accessories and other web services.

Not only does this build look fantastic, it’s also a great conversation starter for any visitors who had a similar device during the eighties.

Also take a look at Martin’s 1970s Amazon Alexa phone for more nostalgic splendour.

Put it in a box

…and then I’ll put that box inside of another box, and then I’ll mail that box to myself, and when it arrives…

A GIF from the emperors new groove - Raspberry Pi Home Assistant

A GIF. A harmless, little GIF…and proof of the comms team’s obsession with The Emperor’s New Groove.

You don’t have to be fancy when it comes to housing your home assistant. And often, especially if you’re working with the smaller people in your household, the results of a simple homespun approach are just as delightful.

Here are Hannah and her dad Tom, explaining how they built a home assistant together and fit it inside an old cigar box:

Raspberry Pi 3 Amazon Echo – The Alexa Kids Build!

My 7 year old daughter and I decided to play around with the Raspberry Pi and build ourselves an Amazon Echo (Alexa). The video tells you about what we did and the links below will take you to all the sites we used to get this up and running.

Also see the Google AIY Projects Voice kit — the cardboard box-est of home assistant boxes.

Make your own home assistant

And now it’s your turn! I challenge you all (and also myself) to create a home assistant using the Raspberry Pi. Whether you decide to fit Amazon Alexa inside an old shoebox or Google Home inside your sister’s Barbie, I’d love to see what you create using the free home assistant software available online.

Check out these other home assistants for Raspberry Pi, and keep an eye on our blog to see what I manage to create as part of the challenge.

Ten virtual house points for everyone who shares their build with us online, either in the comments below or by tagging us on your social media account.

The post Marvellous retrofitted home assistants appeared first on Raspberry Pi.

The Pi Towers Secret Santa Babbage

Post Syndicated from Mark Calleja original https://www.raspberrypi.org/blog/secret-santa-babbage/

Tired of pulling names out of a hat for office Secret Santa? Upgrade your festive tradition with a Raspberry Pi, thermal printer, and everybody’s favourite microcomputer mascot, Babbage Bear.

Raspberry Pi Babbage Bear Secret Santa

The name’s Santa. Secret Santa.

It’s that time of year again, when the cosiness gets turned up to 11 and everyone starts thinking about jolly fat men, reindeer, toys, and benevolent home invasion. At Raspberry Pi, we’re running a Secret Santa pool: everyone buys a gift for someone else in the office. Obviously, the person you buy for has to be picked in secret and at random, or the whole thing wouldn’t work. With that in mind, I created Secret Santa Babbage to do the somewhat mundane task of choosing gift recipients. This could’ve just been done with some names in a hat, but we’re Raspberry Pi! If we don’t make a Python-based Babbage robot wearing a jaunty hat and programmed to spread Christmas cheer, who will?

Secret Santa Babbage

Ho ho ho!

Mecha-Babbage Xmas shenanigans

The script the robot runs is pretty basic: a list of names entered as comma-separated strings is shuffled at the press of a GPIO button, then a name is popped off the end and stored as a variable. The name is matched to a photo of the person stored on the Raspberry Pi, and a thermal printer pinched from Alex’s super awesome PastyCam (blog post forthcoming, maybe) prints out the picture and name of the person you will need to shower with gifts at the Christmas party. (Well, OK — with one gift. No more than five quid’s worth. Nothing untoward.) There’s also a redo function, just in case you pick yourself: press another button and the last picked name — still stored as a variable — is appended to the list again, which is shuffled once more, and a new name is popped off the end.
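
To make the logic concrete, here is a minimal sketch of that name-drawing flow (this is not Mark’s actual script, which lives in the GitHub repo; the pin numbers, names, and the printing stub are placeholders):

    import random
    from signal import pause
    from gpiozero import Button, LED

    # Placeholder names and pins -- the real values live in the repo.
    names = ["Alex", "Emily", "Gordon", "Liz"]
    last_pick = None

    green = LED(17)          # lit while the program is running
    red = LED(27)            # lit while a request is being processed
    pick_button = Button(5)
    redo_button = Button(6)

    def print_recipient(name):
        # Stub: the real build looks up the person's photo and sends it,
        # along with their name, to the thermal printer.
        print("Buy a gift for:", name)

    def pick_name():
        global last_pick
        red.on()
        random.shuffle(names)
        last_pick = names.pop()
        print_recipient(last_pick)
        red.off()

    def redo():
        # If you drew yourself, put the name back and draw again.
        global last_pick
        if last_pick is not None:
            names.append(last_pick)
            last_pick = None
        pick_name()

    green.on()
    pick_button.when_pressed = pick_name
    redo_button.when_pressed = redo
    pause()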

Secret Santa Babbage prototyping

Prototyping!

As the build was a bit of a rush job undertaken at the request of our ‘Director of Vibe’ Emily, there are a few things I’d like to improve about this functionality that I didn’t get around to — more on that later. To add some extra holiday spirit to the project at the last minute, I used Pygame to play a WAV file of Santa’s jolly laugh while Babbage chooses a name for you. The file is included in the GitHub repo along with everything else, because ‘tis the season, etc., etc.
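
Playing the laugh with Pygame only takes a few lines; something along these lines (the filename is illustrative, not necessarily the one in the repo):

    import pygame

    pygame.mixer.init()
    laugh = pygame.mixer.Sound("santa_laugh.wav")   # placeholder filename
    laugh.play()
    while pygame.mixer.get_busy():                  # wait for the laugh to finish
        pygame.time.wait(100)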

Secret Santa Babbage prototyping

Editor’s note: Considering these desk adornments, Mark’s Secret Santa gift-giver has a lot to go on.

Writing the code for Xmas Mecha-Babbage was fairly straightforward, though it uses some tricky bits for managing the thermal printer. You’ll need to install the drivers to make it go, as well as the CUPS package for managing the print hosting. You can find instructions for these things here, thanks to the wonderful Adafruit crew. Also, for reasons I couldn’t fathom, this will all only work on a Pi 2 and not a Pi 3, as there are some compatibility issues with the thermal printer otherwise. (I also tested the script on a Pi Zero W…no dice.)

Building a Christmassy throne

The hardest (well, fiddliest) parts of making the whole build were constructing the throne and wiring the bear. Using MakerCase, Inkscape, a bit of ingenuity, and a laser cutter, I was able to rig up a Christmassy plywood throne which has a hole through the seat so I could run the wires down from Babbage and to the Pi inside. I finished the throne by rubbing a couple of fingers of beeswax into it; as well as making the wood shine just a little bit and protecting it against getting wet, this had the added bonus of making it smell awesome.

Secret Santa Babbage inside

Next year’s iteration will be mulled wine–scented.

I next soldered two LEDs to some lengths of wire, and then ran the wires through holes at the top of the throne and down the back along a small channel I had carved with a narrow chisel to connect them to the Pi’s GPIO pins. The green LED will remain on as long as Babbage is running his program, and the red one will light up while he is processing your request. Once the red LED goes off again, the next person can have a go. I also laser-cut a final piece of wood to overlay the back of Babbage’s Xmas throne and cover the wiring a bit.

Creating a Xmas cyborg bear

Taking two 6 mm tactile buttons, I clipped the spiky metal legs off one side of each (the buttons were going into a stuffed Christmas toy, after all) and soldered a length of wire to each of the remaining legs. Next, I made a small incision into Babbage with my trusty Swiss army knife (in a place that actually made me cringe a little) and fed the buttons up into his paws. At some point in this process I was standing in the office wrestling with the bear and muttering to myself, which elicited some very strange looks from my colleagues.

Secret Santa Babbage throne

Poor Babbage…

One thing to note here is to make sure the wires remain attached at the solder points while you push them up into Babbage’s paws. The first time I tried it, I snapped one of my connections and had to start again. It helped to remove some stuffing to make a tunnel and then replace it afterwards, and you can use your fingertip to support the joints as you poke the wire in. Finally, a couple of squirts of hot glue to keep Babbage’s furry cheeks firmly on the seat, and done!

Secret Santa Babbage

Next year: Game of Thrones–inspired candy cane throne

The Secret Santa Babbage masterpiece

The whole build process was the perfect holiday mix of cheerful and macabre, and while getting the thermal printer to work was a little time-consuming, the finished product definitely raised some smiles around the office and added a bit of interesting digital flavour to a staid office tradition. And it also helped people who are new to the office or from other branches of the Foundation to know for whom they will be buying a gift.

Secret Santa Babbage

Ready to dispense Christmas cheer!

There are a few ways in which I’ll polish this project before next year, such as having the script write the names to external text files to create a record that will persist in case of a reboot, and maybe having Secret Santa Babbage play you a random Christmas carol when you squeeze his paw instead of just laughing merrily every time. (I also thought about adding electric shocks for those people who are on the naughty list, but HR said no. Bah, humbug!)

Make your own

The code and laser cut plans for the whole build are available here. If you plan to make your own, let us know which stuffed toy you will be turning into a Secret Santa cyborg! And if you’ve been working on any other Christmas-themed Raspberry Pi projects, we’d like to see those too, so tag us on social media to share the festive maker cheer.

The post The Pi Towers Secret Santa Babbage appeared first on Raspberry Pi.

European Commission Steps Up Fight Against Online Piracy

Post Syndicated from Ernesto original https://torrentfreak.com/european-commission-steps-up-fight-against-online-piracy-171130/

The European Commission has had copyright issues at the top of its agenda for a while, resulting in several controversial proposals.

This week it presented a series of new measures to ensure that copyright holders are well protected, targeting both online piracy and counterfeit goods.

“Today we boost our collective ability to catch the ‘big fish’ behind fake goods and pirated content which harm our companies and our jobs – as well as our health and safety in areas such as medicines or toys,” Commissioner Elżbieta Bieńkowska announced.

The Commission notes that it’s stepping up the fight against counterfeiting and piracy. However, many of the proposals are not entirely new for those who follow anti-piracy issues around the globe.

One of the main goals is to focus on the people who facilitate copyright infringement, such as pirate site operators, and try to cut their revenue streams.

“The Commission seeks to deprive commercial-scale IP infringers of the revenue flows that make their criminal activity lucrative – this is the so-called ‘follow the money’ approach which focuses on the ‘big fish’ rather than individuals,” they write.

Instead of using legislation to reach this goal, the Commission prefers to continue its support for voluntary agreements between copyright holders and third-party services. This includes deals with advertising and payment services to cut their ties with pirate sites.

“Such agreements can lead to faster action against counterfeiting and piracy than court actions,” the Commission writes.

Another tool to fight piracy appears on the agenda for the first time. The European Commission notes that it will also support the quest for new anti-piracy initiatives, including the use of blockchain technology.

“Supporting industry-led initiatives to combat IP infringements, including work on Memoranda of Understanding and exploring the potential of new technologies such as blockchain to combat IP infringements in supply chains,” the suggestion reads.

No concrete examples were given but earlier this week, European Parliament member Brando Benifei wrote an article on the issue in Euractiv.

Benifei mentions that blockchain technology can help independent artists collect royalty payments without the need for middlemen. In a similar vein, blockchains can also be used to track the unauthorized distribution of works.

In addition to broadening the anti-piracy horizon, the European Commission also released new guidance on how the current IPR Enforcement Directive (IPRED) should be interpreted, taking into account various recent developments, including landmark EU Court of Justice rulings.

The guidance explains how and when it’s appropriate to issue website blocking orders, for example. In general, blocking injunctions are warranted when they are proportional and aimed at preventing concrete infringements.

The comprehensive guidance also covers the issue of filtering. Interestingly, the Commission clarifies that third-party services can’t be required to “install and operate excessively broad, unspecific and expensive filtering systems.”

This appears to run counter to the mandatory piracy filters that were suggested as part of the copyright reform proposal.

However, the Commission notes that in some specific cases, hosting providers (e.g. YouTube) can be ordered to monitor uploads. This is in line with a recent communication which recommended that online services should implement measures to automatically detect and remove suspected illegal content.

While the new plans continue down the path of stronger copyright protections, not all rightsholders are happy. IFPI is glad that the main problems are highlighted, but would have liked to have seen more concrete plans.

“We are disappointed that despite the European Commission recognizing the need to modernize IPRED and years of evidence gathering, today’s result is merely guidance to EU Member State governments. Soft law does not give right holders the tools they need to take effective action against pirate services,” IFPI writes.

On the other side of the divide, opposition to the previously announced EU copyright reform plans continues as well. Earlier today a group of over 80 organizations urged EU member states to speak out against several controversial copyright proposals, including the upload filter.

“The signatories warn the Member states that the discussion around the Copyright Directive are on the verge of causing irreparable damage to our fundamental rights and freedoms, our economy and competitiveness, our education and research, our innovation and competition, our creativity and our culture,” they say.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and more. We also have VPN discounts, offers and coupons

Presenting AWS IoT Analytics: Delivering IoT Analytics at Scale and Faster than Ever Before

Post Syndicated from Tara Walker original https://aws.amazon.com/blogs/aws/launch-presenting-aws-iot-analytics/

One of the technology areas I thoroughly enjoy is the Internet of Things (IoT). Even as a child I used to infuriate my parents by taking apart the toys they would purchase for me to see how they worked and if I could somehow put them back together. It seems I was somehow destined to end up in the tough and ever-changing world of technology. Therefore, it’s no wonder that I am really enjoying learning and tinkering with IoT devices and technologies. It combines my love of development and software engineering with my curiosity around circuits, controllers, and other facets of the electrical engineering discipline, even though I can’t claim to be an electrical engineer.

Despite all of the information that is collected by the deployment of IoT devices and solutions, I honestly never really thought about the need to analyze, search, and process this data until I came up against a scenario where it became of the utmost importance to be able to search and query through loads of sensor data for an anomaly. Of course, I understood the importance of analytics for businesses to make accurate decisions and predictions that drive the organization’s direction. But it didn’t occur to me initially how important it was to make analytics an integral part of my IoT solutions. Well, I learned my lesson just in time, because at this re:Invent a service is launching to make it easier for anyone to process and analyze IoT messages and device data.

 

Hello, AWS IoT Analytics! AWS IoT Analytics is a fully managed AWS IoT service that provides advanced analysis of the data collected from your IoT devices. With AWS IoT Analytics, you can process messages, gather and store large amounts of device data, and query your data. The service also integrates with Amazon QuickSight for visualization of your data, and brings in the power of machine learning through integration with Jupyter Notebooks.

Benefits of AWS IoT Analytics

  • Helps with predictive analysis of data by providing access to pre-built analytical functions
  • Provides the ability to visualize the analytical output of the service
  • Provides tools to clean up data
  • Can help identify patterns in the gathered data

Be In the Know: IoT Analytics Concepts

  • Channel: archives the raw, unprocessed messages and collects data from MQTT topics.
  • Pipeline: consumes messages from channels and allows message processing.
    • Activities: perform transformations on your messages, including filtering attributes and invoking Lambda functions for advanced processing.
  • Data Store: a queryable repository for processed messages. You can have multiple data stores for messages coming from different devices or locations, or filtered by message attributes.
  • Data Set: a data retrieval view over a data store, which can be regenerated on a recurring schedule.

Getting Started with AWS IoT Analytics

First, I’ll create a channel to receive incoming messages.  This channel can be used to ingest data sent to the channel via MQTT or messages directed from the Rules Engine. To create a channel, I’ll select the Channels menu option and then click the Create a channel button.

I’ll name my channel TaraIoTAnalyticsID and give the Channel an MQTT topic filter of Temperature. To complete the creation of my channel, I will click the Create Channel button.

Now that I have my Channel created, I need to create a Data Store to receive and store the messages received on the Channel from my IoT device. Remember you can set up multiple Data Stores for more complex solution needs, but I’ll just create one Data Store for my example. I’ll select Data Stores from menu panel and click Create a data store.

 

I’ll name my Data Store TaraDataStoreID, and once I click the Create the data store button, I will have successfully set up a Data Store to house messages coming from my Channel.

Now that I have my Channel and my Data Store, I will need to connect the two using a Pipeline. I’ll create a simple pipeline that just connects my Channel and Data Store, but you can create a more robust pipeline to process and filter messages by adding Pipeline activities like a Lambda activity.

To create a pipeline, I’ll select the Pipelines menu option and then click the Create a pipeline button.

I will not add an Attribute for this pipeline, so I will click the Next button.

As we discussed, there are additional pipeline activities that I can add to my pipeline for processing and transforming messages, but I will keep my first pipeline simple and hit the Next button.

The final step in creating my pipeline is for me to select my previously created Data Store and click Create Pipeline.

All that is left for me to take advantage of the AWS IoT Analytics service is to create an IoT rule that sends data to an AWS IoT Analytics channel.  Wow, that was a super easy process to set up analytics for IoT devices.
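
The same resources can also be created programmatically. As a rough sketch using boto3 (the parameter shapes below are from memory of the IoT Analytics API, and the pipeline name is made up, so check the documentation before relying on them):

    import boto3

    iota = boto3.client("iotanalytics")

    # Channel and data store, matching the names used in the console walkthrough.
    iota.create_channel(channelName="TaraIoTAnalyticsID")
    iota.create_datastore(datastoreName="TaraDataStoreID")

    # A minimal pipeline that simply connects the channel to the data store;
    # extra activities (e.g. a Lambda activity) would slot in between.
    iota.create_pipeline(
        pipelineName="TaraPipeline",                # assumed name
        pipelineActivities=[
            {"channel": {"name": "read_channel",
                         "channelName": "TaraIoTAnalyticsID",
                         "next": "store"}},
            {"datastore": {"name": "store",
                           "datastoreName": "TaraDataStoreID"}},
        ],
    )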

If I wanted to create a Data Set as a result of queries run against my data for visualization with Amazon QuickSight, or to integrate with Jupyter Notebooks to perform more advanced analytical functions, I can choose the Analyze menu option to bring up the screens to create data sets and access the Jupyter Notebook instances.
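
In the same hedged spirit, a Data Set can be defined programmatically as a SQL query over the data store (the data set name and the exact action shape here are assumptions):

    import boto3

    iota = boto3.client("iotanalytics")

    # Define a data set as a SQL query over the data store (names are assumed).
    iota.create_dataset(
        datasetName="TaraDataSet",
        actions=[{
            "actionName": "daily_query",
            "queryAction": {"sqlQuery": "SELECT * FROM TaraDataStoreID"},
        }],
    )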

Summary

As you can see, it was a very simple process to set up the advanced data analysis for AWS IoT. With AWS IoT Analytics, you have the ability to collect, visualize, process, query and store large amounts of data generated from your AWS IoT connected device. Additionally, you can access the AWS IoT Analytics service in a myriad of different ways; the AWS Command Line Interface (AWS CLI), the AWS IoT API, language-specific AWS SDKs, and AWS IoT Device SDKs.

AWS IoT Analytics is available today for you to dig into the analysis of your IoT data. To learn more about AWS IoT and AWS IoT Analytics go to the AWS IoT Analytics product page and/or the AWS IoT documentation.

Tara

Weekly roundup: VK Ultra

Post Syndicated from Eevee original https://eev.ee/dev/2017/11/27/weekly-roundup-vk-ultra/

  • fox flux: Cleaned up and committed the “heart get” overlay and worked on some more art for it. Diagnosed a very obscure physics problem, but didn’t come up with a good solution yet; physics is hard! Drew a very good tree trunk to use as a spawn point; also worked on some background foliage, though less successfully. Played with colors a bit. Tried to work out a tileset for underground areas.

  • music: I wrote like half of a little chiptune song that I actually like so far! I’m now seriously toying with the idea of doing my own music for fox flux. Played a bit with more sound effects, too.

  • blog: I wrote up the Eevee mugshot set for Doom I made, as an inaugural post for the release category.

  • veekun: Finished up Ultra Sun and Ultra Moon! Pokémon sprites, box sprites, item sprites, and the same data as Sun/Moon. I say “finished” but of course plenty of stuff is still missing, alas.

  • cc: I’m trying to make glip some building blocks so that they can actually start building the game, so I made some breakable blocks. Also wrote a little shader for implementing their parallax background, which involves a bunch of layer modes.

  • misc: I got a new keyboard. Also I installed umatrix because noscript’s web extension version is half-broken and driving me up the wall. Sorry, noscript.

Huh, that’s not a bad haul, despite a few nights of incredibly bad sleep. Cool.

Build a Flick-controlled marble maze

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/flick-marble-maze/

Wiggle your fingers to guide a ball through a 3D-printed marble maze using the Pi Supply Flick board for Raspberry Pi!

Wiggle, wiggle, wiggle, wiggle, yeah

Using the Flick, previously seen in last week’s gesture-controlled holographic visualiser from Hacker House, South Africa–based Tom Van den Bon has created a touch-free marble maze. He was motivated, if his Twitter is any indication, by his love of game-making and 3D printing.

Tom Van den Bon on Twitter

Day 172 of #3dprint365. #3dprinted Raspberry PI Controlled Maze Thingie Part 3 #3dprint #3dprinter #thingiverse #raspberrypi #pisupply

All non-electronic parts of this build are 3D printed. The marble maze sits atop a motorised structure which moves along two axes thanks to servo motors. Tom controls the movement using gestures which are picked up by the Flick Zero, a Pi Zero–sized 3D-tracking board that can detect movement up to 15cm away.
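
As a sketch of the core idea only (this is not Tom’s code): map a hand position reported by the Flick to tilt angles on the two servos. The on_position() hook below is hypothetical (wire it up to whichever position event the Flick library actually provides), and the pins, angle range, and 0–1 coordinate assumption are placeholders.

    from gpiozero import AngularServo

    # Two servos tilt the maze along its two axes; pins and angles are examples.
    servo_x = AngularServo(17, min_angle=-20, max_angle=20)
    servo_y = AngularServo(18, min_angle=-20, max_angle=20)

    def on_position(x, y):
        # Hypothetical callback: assumes x and y arrive normalised to 0..1.
        # Centre them and scale to the servo range so the maze tilts
        # toward the hand.
        servo_x.angle = (x - 0.5) * 40
        servo_y.angle = (y - 0.5) * 40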

Find the code for the maze, which takes advantage of the Flick library, on Tom’s GitHub account.

Make your own games

Our free resources are a treasure trove of fun home-brew games that you can build with your friends and family.

If you like physical games such as Tom’s gesture-controlled maze, you should definitely check out our Python quick reaction game! In it, players are pitted against each other to react as quickly as possible to an LED that lights up at random.

raspberry pi marble maze

You can also play solo with our Lights out game, where it’s you against four erratic lights eager to remain lit.

For games you can build on your computer with no need for any extra tech, Scratch games such as our button-smashing Olympic weightlifter and Hurdler projects are perfect — you can play them just using a keyboard and browser!

raspberry pi marble maze

And if you’d like to really get stuck into learning about game development, then you’re in luck! CoderDojo’s Make your own game book guides you through all the steps of building a game in JavaScript, from creating the world to designing characters.

Cover of CoderDojo Nano Make your own game

And because I just found this while searching for image content for today’s blog, here is a photo of Eben and Liz’s cat Mooncake with a Raspberry Pi on her head. Enjoy!

A cat with a Raspberry Pi pin on its head — raspberry pi marble maze

Ras-purry Pi?

The post Build a Flick-controlled marble maze appeared first on Raspberry Pi.

A Raspberry Pi Halloween projects spectacular

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/halloween-projects-2017/

Come with us on a journey to discover the 2017 Raspberry Pi Halloween projects that caught our eye, raised our hair, or sent us screaming into the night.

A clip of someone being pulled towards a trap door by hands reaching up from it - Raspberry Pi Halloween projects

Happy Halloween

Whether you’re easily scared or practically unshakeable, you can celebrate Halloween with Pi projects of any level of creepiness.

Even makers of a delicate constitution will enjoy making this Code Club Ghostbusters game, or building an interactive board game using Halloween lights with this MagPi tutorial by Mike Cook. And how about a wearable, cheerily LED-enhanced pumpkin created with the help of this CoderDojo resource? Cute, no?

Felt pumpkin with blinking LED smiley face - Raspberry Pi Halloween projects

Speaking of wearables, Derek Woodroffe’s be-tentacled hat may writhe disconcertingly, but at least it won’t reach out for you. Although, you could make it do that, if you were a terrible person.

Slightly queasy Halloween

Your decorations don’t have to be terrifying: this carved Pumpkin Pi and the Poplawskis’ Halloween decorations are controlled remotely via the web, but they’re more likely to give you happy goosebumps than cold sweats.

A clip of blinking Halloween decorations covering a house - Raspberry Pi Halloween projects

The Snake Eyes Bonnet pumpkin and the monster-face projection controlled by Pis that we showed you in our Halloween Twitter round-up look fairly friendly. Even the 3D-printed jack-o’-lantern by wermy, creator of mintyPi, is kind of adorable, if you ignore the teeth. And who knows, that AlexaPi-powered talking skull that’s staring at you could be an affable fellow who just fancies a chat, right? Right?

Horror-struck Halloween

OK, fine. You’re after something properly frightening. How about the haunted magic mirror by Kapitein Haak, or this one, with added Philips Hue effects, by Ben Eagan? As if your face first thing in the morning wasn’t shocking enough.

Haunted magic mirror demonstration - Raspberry Pi Halloween projects

If you find those rigid-faced, bow-lipped, plastic dolls more sinister than sweet – and you’re right to do so: they’re horrible – you won’t like this evil toy. Possessed by an unquiet shade, it’s straight out of my nightmares.

Earlier this month we covered Adafruit’s haunted portrait how-to. This build by Dominick Marino takes that concept to new, terrifying, heights.

Haunted portrait project demo - Raspberry Pi Halloween projects

Why not add some motion-triggered ghost projections to your Halloween setup? They’ll go nicely with the face-tracking, self-winding, hair-raising jack-in-the-box you can make thanks to Sean Hodgins’ YouTube tutorial.

And then, last of all, there’s this.

The Saw franchise's Billy the puppet on a tricycle - Raspberry Pi Halloween projects

NO.

This recreation of Billy the Puppet from the Saw franchise is Pi-powered, it’s mobile, and it talks. You can remotely control it, and I am not even remotely OK with it. That being said, if you’re keen to have one of your own, be my guest. Just follow the guide on Instructables. It’s your funeral.

Make your Halloween

It’s been a great year for scary Raspberry Pi makes, and we hope you have a blast using your Pi to get into the Halloween spirit.

And speaking of spirits, Matt Reed of RedPepper has created a Pi-based ghost detector! It uses Google’s Speech Neural Network AI to listen for voices in the ether, and it’s live-streaming tonight. Perfect for watching while you’re waiting for the trick-or-treaters to show up.

The post A Raspberry Pi Halloween projects spectacular appeared first on Raspberry Pi.

Google Asked to Remove 3 Billion “Pirate” Search Results

Post Syndicated from Ernesto original https://torrentfreak.com/google-asked-to-remove-3-billion-pirate-search-results-171018/

Copyright holders continue to flood Google with DMCA takedown requests, asking the company to remove “pirate links” from its search results.

In recent years the number of reported URLs has exploded, surging to unprecedented heights.

Since Google first started to report the volume of takedown requests in its Transparency Report, the company has been asked to remove more than three billion allegedly infringing search results.

The frequency at which these URLs are reported has increased over the years and at the moment roughly three million ‘pirate’ URLs are submitted per day.

The URLs are sent in by major rightsholders including members of the BPI, RIAA, and various major Hollywood studios. They target a wide variety of sites, over 1.3 million, but a few dozen ‘repeat offenders’ are causing the most trouble.

File-hosting service 4shared.com currently tops the list of most-targeted domains with 66 million URLs, followed by the now-defunct MP3 download site MP3toys.xyz and Rapidgator.net, with 51 and 28 million URLs respectively.

3 billion URLs

Interestingly, the high volume of takedown notices is used as an argument for and against the DMCA process.

While Google believes that the millions of reported URLs per day are a sign that the DMCA takedown process is working correctly, rightsholders believe the volumes are indicative of an unbeatable game of whack-a-mole.

According to some copyright holders, the takedown efforts do little to seriously combat piracy. Various industry groups have therefore asked governments and lawmakers for broad revisions.

Among other things they want advanced technologies and processes to ensure that infringing content doesn’t reappear elsewhere once it’s removed, a so-called “notice and stay down” approach. In addition, Google has often been asked to demote pirate links in search results.

UK music industry group BPI, who are responsible for more than 10% of all the takedown requests on Google, sees the new milestone as an indicator of how much effort its anti-piracy activities take.

“This 3 billion figure shows how hard the creative sector has to work to police its content online and how much time and resource this takes. The BPI is the world’s largest remover of illegal music links from Google, one third of which are on behalf of independent record labels,” Geoff Taylor, BPI’s Chief Executive, informs TF.

However, there is also some progress to report. Earlier this year BPI announced a voluntary partnership with Google and Bing to demote pirate content faster and more effectively for US visitors.

“We now have a voluntary code of practice in place in the UK, facilitated by Government, that requires Google and Bing to work together with the BPI and other creator organizations to develop lasting solutions to the problem of illegal sites gaining popularity in search listings,” Taylor notes.

According to BPI, both Google and Bing have shown that changes to their algorithms can be effective in demoting the worst pirate sites from the top search results and they hope others will follow suit.

“Other intermediaries should follow this lead and take more responsibility to work with creators to reduce the proliferation of illegal links and disrupt the ability of illegal sites to capture consumers and build black market businesses that take money away from creators.”

Agreement or not, there are still plenty of pirate links in search results, so the BPI is still sending out millions of takedown requests per month.

We asked Google for a comment on the new milestone but at the time of writing, we have yet to hear back. In any event, the issue is bound to remain a hot topic during the months and years to come.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Sean Hodgins’ Haunted Jack in the Box

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/sean-hodgins-haunted-jack-box/

After making a delightful Bitcoin lottery using a Raspberry Pi, Sean Hodgins brings us more Pi-powered goodness in time for every maker’s favourite holiday: Easter! Just kidding, it’s Halloween. Check out his hair-raising new build, the Haunted Jack in the Box.

Haunted Jack in the Box – DIY Raspberry Pi Project

This project uses a raspberry pi and face detection using the pi camera to determine when someone is looking at it. Plenty of opportunities to scare people with it. You can make your own!

Haunted jack-in-the-box?

Imagine yourself wandering around a dimly lit house. Your eyes idly scan a shelf. Suddenly, out of nowhere, a twangy melody! What was that? You take a closer look…there seems to be a box in jolly colours…with a handle that’s spinning by itself?!

Sidling up to Sean Hodgins' Haunted Jack in the Box

What’s…going on?

You freeze, unable to peel your eyes away, and BAM!, out pops a maniacally grinning clown. You promptly pee yourself. Happy Halloween, courtesy of Sean Hodgins.

Clip of Sean Hodgins' Haunted Jack in the Box

Eerie disembodied voice: You’re welco-o-o-ome!

How has Sean built this?

Sean purchased a jack-in-the-box toy and replaced its bottom side with one that would hold the necessary electronic components. He 3D-printed this part, but says you could also just build it by hand.

The bottom of the box houses a Raspberry Pi 3 Model B and a servomotor which can turn the windup handle. There’s also a magnetic reed switch which helps the Pi decide when to trigger the Jack. Sean hooked up the components to the Pi’s GPIO pins, and used an elastic band as a drive belt to connect the pulleys on the motor and the handle.

Film clip showing the inside of Sean Hodgin's Haunted Jack in the Box

Sean explains that he has used a lot of double-sided tape and superglue in this build. The bottom and top are held together with two screws, because, as he describes it, “the Jack coming out is a little violent.”

In addition to his video walk-through, he provides build instructions on Instructables, Hackaday, Hackster, and Imgur — pick your poison. And be sure to subscribe to Sean’s YouTube channel to see what he comes up with next.

Wait, how does the haunted part work?

But if I explain it, it won’t be scary anymore! OK, fiiiine.

With the help of a Camera Module and OpenCV, Sean implemented facial recognition: Jack knows when someone is looking at his box, and responds by winding up and popping out.

View of command line output of the Python script for Sean Hodgins' Haunted Jack in the Box

Testing the haunting script

Sean’s Python script is available here, but as he points out, there are many ways in which you could adapt this code, and the build itself, to be even more frightening.
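
For a feel of how the detection side works, here is a minimal face-detection loop in the same spirit (this is not Sean’s script; the camera source, cascade path, and trigger() stub are placeholders):

    import cv2

    # Haar cascade face detector bundled with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)   # placeholder: the build reads the Pi camera

    def trigger():
        # Stub: in the real build this winds the handle via the servo
        # so Jack pops out.
        print("Face spotted -- wind the handle!")

    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) > 0:
            trigger()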

So very haunted

What would you do with this build? Add creepy laughter? Soundbites from It? Lighting effects? Maybe even infrared light and a NoIR Camera Module, so that you can scare people in total darkness? There are so many possibilities for this project — tell us your idea in the comments.

The post Sean Hodgins’ Haunted Jack in the Box appeared first on Raspberry Pi.

Self-Driving Cars Should Be Open Source

Post Syndicated from Bozho original https://techblog.bozho.net/self-driving-cars-open-source/

Self-driving cars are (will be) the pinnacle of consumer product automation – robot vacuum cleaners, smart fridges and TVs are just toys compared to self-driving cars. Both in terms of technology and in terms of impact. We aren’t yet at level 5 self-driving cars, but they are just around the corner.

But as software engineers we know how fragile software is. And self-driving cars are basically software, so we can see all the risks involved in putting our lives in the hands of anonymous (from our point of view) developers and unknown (to us) processes and quality standards. One may argue that this has been the case for every consumer product ever, but with software it’s different – software is way more complex than anything else.

So I have an outrageous proposal – self-driving cars should be open source. We have to be able to verify and trust the code that’s navigating our helpless bodies around the highways. Not only that, but we have to be able to verify if it is indeed that code that is currently running in our car, and not something else.

In fact, let me extend that – all cars should be open source. Before you say “but that will ruin the competitive advantage of manufacturers and will be deadly for business”: I don’t actually care how they trained their neural networks, or what their datasets are. That’s actually the secret sauce of the self-driving car and in my view it can remain proprietary and closed. What I’d like to see open-sourced is everything else. (Under what license – I’d be fine even with a restrictive license that isn’t “real” open source, but that’s a separate discussion.)

Why? This story about remote carjacking using the entertainment system of a Jeep is a scary example. Attackers that reverse engineer the car software can remotely control everything in the car. Why did that happen? Well, I guess it’s complicated and we have to watch the DEFCON talk.

And also read the paper, but a paragraph in wikipedia about the CAN bus used in most cars gives us a hint:

CAN is a low-level protocol and does not support any security features intrinsically. There is also no encryption in standard CAN implementations, which leaves these networks open to man-in-the-middle packet interception. In most implementations, applications are expected to deploy their own security mechanisms; e.g., to authenticate incoming commands or the presence of certain devices on the network. Failure to implement adequate security measures may result in various sorts of attacks if the opponent manages to insert messages on the bus. While passwords exist for some safety-critical functions, such as modifying firmware, programming keys, or controlling antilock brake actuators, these systems are not implemented universally and have a limited number of seed/key pairs.

I don’t know in what world it makes sense to even have a link between the entertainment system and the low-level network that operates the physical controls. As apparent from the talk, the two systems are supposed to be air-gapped, but in reality they aren’t.

Rookie mistakes abound – an unauthenticated “execute” method, running as root, unsigned firmware, hard-coded passwords, etc. How do we know that there aren’t tons of those in all the cars out there right now, and in the self-driving cars of the future (which will likely use the same legacy technologies as current cars)? Recently I heard a negative comment about the source code of one of the self-driving car “players”, and I’m pretty sure there are many of those rookie mistakes in it.

Why is this even riskier for self-driving cars? I’m not an expert in car programming, but it seems like the attack surface is bigger. I might be completely off target here, but on a typical car you’d have to “just” properly isolate the CAN bus. With self-driving cars, the autonomous system that watches the surroundings and decides what to do next has to be connected to the CAN bus. With Tesla being able to send updates over the wire, the attack surface is even bigger (although that’s actually a good feature – to be able to patch all cars immediately once a vulnerability is discovered).

Of course, one approach would be to introduce legislation that regulates car software. It might work, but it would rely on governments to do proper testing, which won’t always be the case.

The alternative is to open-source it and let all the white-hats find your issues, so that you can close them before the car hits the road. Not only that, but consumers like me will feel safer, and geeks would be able to verify whether the car is really running the software it claims to run by verifying the fingerprints.
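
As a toy illustration of what “verifying the fingerprints” could mean in practice (reading the image off the device in a trustworthy way is, of course, the hard part):

    import hashlib

    def fingerprint(path):
        # SHA-256 of a firmware image, read in chunks.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Compare against the hash published alongside the open source release.
    published = "0123..."   # placeholder value
    print(fingerprint("firmware.bin") == published)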

Richard Stallman might be seen as a fanatic when he advocates against closed source software, but in cases like … cars, his concerns seem less extreme.

“But the Jeep vulnerability was fixed”, you may say. And that might be seen as being the way things are – vulnerabilities appear, they get fixed, life goes on. No person was injured because of the bug, right? Well, not yet. And “gaining control” is the extreme scenario – there are still pretty bad scenarios, like being able to track a car through its GPS, or cause panic by controlling the entertainment system. It might be over wifi, or over GPRS, or even by physically messing with the car by inserting a flash drive. Is open source immune to those issues? No, but it has proven to be more resilient.

One industry that already illustrates the problem of proprietary software in a product the customer has bought is … tractors. It turns out farmers are hacking their tractors because of multiple issues and the inability of the vendor to resolve them in a timely manner. This is likely to happen to cars soon, when only authorized repair shops are allowed to touch anything on the car. And with unauthorized repair shops the attack surface becomes even bigger.

In fact, I’d prefer open source not just for cars, but for all consumer products. The source code of a smart fridge or a security camera is trivial; opening it would rarely mean sacrificing competitive advantage. But refrigerators get hacked, security cameras are an active part of botnets, and the “internet of shit” is getting ubiquitous. A huge number of these issues are dumb, beginner mistakes. We have the right to know what shit we are running – in our fridges, DVRs and, ultimately, our cars.

Your fridge may soon be spying on you, your vacuum cleaner may threaten your pet in demand of a “ransom”. The terrorists of the future may crash planes without being armed, crash vans into crowds without being in the van, and “explode” home equipment without being in the particular home. And that’s not just a hypothetical.

Will open source magically solve the issue? No. But it will definitely make things better and safer, as it has done with operating systems and web servers.

The post Self-Driving Cars Should Be Open Source appeared first on Bozho's tech blog.

3D print your own Rubik’s Cube Solver

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/rubiks-cube-solver/

Why use logic and your hands to solve a Rubik’s Cube, when you could 3D print your own Rubik’s Cube Solver and thus avoid overexerting your fingers and brain cells? Here to help you with this is Otvinta‘s new robotic make:

Fully 3D-Printed Rubik’s Cube Solving Robot

This 3D-printed Raspberry PI-powered Rubik’s Cube solving robot has everything any serious robot does — arms, servos, gears, vision, artificial intelligence and a task to complete. If you want to introduce robotics to your kids or your students, this is the perfect machine for it. This robot is fully 3D-printable.

Rubik’s Cubes

As Liz has said before, we have a lot of Rubik’s cubes here at Pi Towers. In fact, let me just…hold on…I’ll be right back.

Okay, these are all the ones I found on Gordon’s desk, and I’m 99% sure there are more in his drawers.

Raspberry Pi Rubik's Cube Solver

And that’s just Gordon. Given that there’s a multitude of other Pi Towers staff members who are also obsessed with the little twisty cube of wonder, you could use what you find in our office to restock an entire toy shop for the pre-Christmas rush!

So yeah, we like Rubik’s Cubes.

The 3D-Printable Rubik’s Cube Solver

Aside from the obvious electronic elements, Otvinta’s Rubik’s Cube Solving Robot is completely 3D-printable. While it may take a whopping 70 hours of print time and a whole spool of filament to make your solving robot a reality, we’ve seen far more time-consuming prints with a lot less purpose than this.

(If you’ve clicked the link above, I’d just like to point out that, while that build might be 3D printing overkill, I want one anyway.)

Rubik's Cube Solver

After 3D printing all the necessary parts of your Rubik’s Cube Solving Robot, you’ll need to install Windows 10 IoT Core on your Raspberry Pi. Once the Pi is connected to your network, you can select it from the IoT Dashboard on your main PC and install the RubiksCubeRobot app.

Raspberry Pi Rubik's Cube Solver

Then simply configure the robot via the app, and you’re good to go!

You might not necessarily need a Raspberry Pi to create this build, since you could simply run the app on your main PC. However, using a Pi will make your project more manageable and less bulky.

You can find all the details of how to make your own Rubik’s Cube Solving Robot on Otvinta’s website, so do make sure to head over there if you want to learn more.

All the robots!

This isn’t the first Raspberry Pi-powered Rubik’s Cube out there, and it surely won’t be the last. There’s this one by Francesco Georg using LEGO Mindstorms; this one was originally shared on Reddit; Liz wrote about this one; and there’s one more which I can’t seem to find but I swear exists, and it looks like the Eye of Sauron! Ten House Points to whoever shares it with me in the comments below.

The post 3D print your own Rubik’s Cube Solver appeared first on Raspberry Pi.

Mod your Nerf gun with a Pi

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/mod-nerf-gun-pi/

Michael Darby, who blogs at 314reactor, has created a new Raspberry Pi build, and it’s pretty darn cool. Though it’s not the first Raspberry Pi-modded Nerf gun we’ve seen, it’s definitely one of the most complex!

Nerf Gun Ammo Counter / Range Finder – Raspberry Pi

An ammo counter and range finder made from a Raspberry Pi for a Nerf Gun.

Nerf guns

Nerf guns are toy dart guns that have been on the market since the early 1990s. They are popular with kids and adults who enjoy playing paintball, laser tag, and first-person shooter video games. Michael loves Nerf guns, and he wanted to give his toy a sci-fi overhaul, making it look and function more like a gun that an avatar might use in Half-Life, Quake, or Doom.

Modding a Nerf gun

A busy and creative member of the Raspberry Pi community, Michael has previously delighted us with his Windows 98 wristwatch. Now, he has upgraded his Nerf gun with a rangefinder and an ammo counter by adding a Pi, a Pimoroni Rainbow HAT, and some sensors.

Setting up a rangefinder was straightforward. Michael fixed an ultrasonic distance sensor pointing in the direction of the gun’s barrel. Live information about how far away he is from his target is shown on the Rainbow HAT’s alphanumeric display.

View of Michael Darby's nerf gun range finder
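If you just want to see the rangefinder idea working before following Michael’s write-up, a minimal sketch along these lines will print live distance readings in a terminal. It assumes an HC-SR04-style ultrasonic sensor wired to hypothetical GPIO pins (trigger on 17, echo on 18) and the gpiozero library; Michael’s actual code, including the Rainbow HAT display output, is on his blog.

from time import sleep
from gpiozero import DistanceSensor

# Hypothetical wiring for illustration: trigger on GPIO 17, echo on GPIO 18.
sensor = DistanceSensor(echo=18, trigger=17, max_distance=4)

while True:
    print("Target at {:.0f} cm".format(sensor.distance * 100))  # .distance is in metres
    sleep(0.2)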

To create an ammo counter, Michael had to follow a more circuitous route. Since he couldn’t think of a way to read out how many darts are in the Nerf gun’s magazine, he ended up counting how many darts have been shot instead. This data is collected via a proximity sensor, a device that can measure shorter distances than an ultrasonic sensor. Michael aimed the sensor towards the end of the barrel, attaching it with Blu-Tack.

View of Michael Darby's nerf gun proximity sensor

The number of shots left in the magazine is indicated by the seven LEDs above the Rainbow HAT’s alphanumeric display. The countdown works for more than seven darts, thanks to colour coding: the LEDs count down first in red, then in orange, and finally in green.

In a Python script running on the Pi, Michael has included a default number of shots per magazine. When he changes a magazine, he uses one of the HAT’s buttons as a ‘Reload’ button, resetting the counter. He has also set up the HAT so that the number of available shots can be entered manually instead.
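The counting logic itself is easy to sketch in Python. The snippet below is not Michael’s code – shot detection is stood in for by keyboard input, and the magazine size and colour thresholds are made up – but it shows the shape of the countdown-and-reload idea described above.

MAGAZINE_SIZE = 12  # hypothetical default; Michael's script keeps its own value

def wait_for_event():
    # Stand-in for the real hardware: type 's' for a detected shot, 'r' for the
    # reload button. Michael's version watches a proximity sensor and a Rainbow
    # HAT touch button instead.
    choice = input("[s]hot or [r]eload? ").strip().lower()
    return "reload" if choice == "r" else "shot"

def led_colour(shots_left):
    # Colour coding as described above: the countdown runs through red, then
    # orange, then green as the magazine empties. The thresholds are guesses.
    if shots_left > 14:
        return "red"
    if shots_left > 7:
        return "orange"
    return "green"

shots_left = MAGAZINE_SIZE
while True:
    if wait_for_event() == "reload":
        shots_left = MAGAZINE_SIZE
    elif shots_left > 0:
        shots_left -= 1
    lit = shots_left % 7
    if lit == 0 and shots_left > 0:
        lit = 7  # a full band of seven LEDs
    print("{} darts left -> {} {} LED(s)".format(shots_left, lit, led_colour(shots_left)))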

Nerf gun modding tutorial

On Michael’s blog you will find a thorough step-by-step guide to how he created this build. He has also included his code, and links to all the components, software installation guides, and test scripts he has used. So head on over there if you’re keen to mod your own nerf gun like this, and take a look at some of his other projects while you’re there!

Michael welcomes suggestions for how to improve upon his mods, especially for how to count shots in a magazine automatically. Do you have an idea? Let us – and him – know in the comments!

Toy mods

Over the years, we’ve covered quite a few fun toy upgrades, and some that may have to be approached with caution. The Pi-powered busy board for babies, the ‘weaponized’ teddy bear, and the inevitable smart Fisher Price phone are just a few from our archives.

What’s your favourite childhood toy, and how could it be improved by the addition of a Pi? Share your ideas with us in the comments below.

The post Mod your Nerf gun with a Pi appeared first on Raspberry Pi.

Raspbian Stretch has arrived for Raspberry Pi

Post Syndicated from Simon Long original https://www.raspberrypi.org/blog/raspbian-stretch/

It’s now just under two years since we released the Jessie version of Raspbian. Those of you who know that Debian run their releases on a two-year cycle will therefore have been wondering when we might be releasing the next version, codenamed Stretch. Well, wonder no longer – Raspbian Stretch is available for download today!

Disney Pixar Toy Story Raspbian Stretch Raspberry Pi

Debian releases are named after characters from Disney Pixar’s Toy Story trilogy. In case, like me, you were wondering: Stretch is a purple octopus from Toy Story 3. Hi, Stretch!

The differences between Jessie and Stretch are mostly under-the-hood optimisations, and you really shouldn’t notice any differences in day-to-day use of the desktop and applications. (If you’re really interested, the technical details are in the Debian release notes here.)

However, we’ve made a few small changes to our image that are worth mentioning.

New versions of applications

Version 3.0.1 of Sonic Pi is included – this includes a lot of new functionality in terms of input/output. See the Sonic Pi release notes for more details of exactly what has changed.

Raspbian Stretch Raspberry Pi

The Chromium web browser has been updated to version 60, the most recent stable release. This offers improved memory usage and more efficient code, so you may notice it running slightly faster than before. The visual appearance has also been changed very slightly.

Raspbian Stretch Raspberry Pi

Bluetooth audio

In Jessie, we used PulseAudio to provide support for audio over Bluetooth, but integrating this with the ALSA architecture used for other audio sources was clumsy. For Stretch, we are using the bluez-alsa package to make Bluetooth audio work with ALSA itself. PulseAudio is therefore no longer installed by default, and the volume plugin on the taskbar will no longer start and stop PulseAudio. From a user point of view, everything should still work exactly as before – the only change is that if you still wish to use PulseAudio for some other reason, you will need to install it yourself.

Better handling of other usernames

The default user account in Raspbian has always been called ‘pi’, and a lot of the desktop applications assume that this is the current user. This has been changed for Stretch, so now applications like Raspberry Pi Configuration no longer assume this to be the case. This means, for example, that the option to automatically log in as the ‘pi’ user will now automatically log in with the name of the current user instead.

One other change is how sudo is handled. By default, the ‘pi’ user is set up with passwordless sudo access. We are no longer assuming this to be the case, so now desktop applications which require sudo access will prompt for the password rather than simply failing to work if a user without passwordless sudo uses them.

Scratch 2 SenseHAT extension

In the last Jessie release, we added the offline version of Scratch 2. While Scratch 2 itself hasn’t changed for this release, we have added a new extension to allow the SenseHAT to be used with Scratch 2. Look under ‘More Blocks’ and choose ‘Add an Extension’ to load the extension.

This works with either a physical SenseHAT or with the SenseHAT emulator. If a SenseHAT is connected, the extension will control that in preference to the emulator.

Raspbian Stretch Raspberry Pi

Fix for Broadpwn exploit

A couple of months ago, a vulnerability was discovered in the firmware of the BCM43xx wireless chipset which is used on Pi 3 and Pi Zero W; this potentially allows an attacker to take over the chip and execute code on it. The Stretch release includes a patch that addresses this vulnerability.

There is also the usual set of minor bug fixes and UI improvements – I’ll leave you to spot those!

How to get Raspbian Stretch

As this is a major version upgrade, we recommend using a clean image; these are available from the Downloads page on our site as usual.

Upgrading an existing Jessie image is possible, but is not guaranteed to work in every circumstance. If you wish to try upgrading a Jessie image to Stretch, we strongly recommend taking a backup first – we can accept no responsibility for loss of data from a failed update.

To upgrade, first modify the files /etc/apt/sources.list and /etc/apt/sources.list.d/raspi.list. In both files, change every occurrence of the word ‘jessie’ to ‘stretch’. (Both files will require sudo to edit.)

Then open a terminal window and execute

sudo apt-get update
sudo apt-get -y dist-upgrade

Answer ‘yes’ to any prompts. There may also be a point at which the install pauses while a page of information is shown on the screen – hold the ‘space’ key to scroll through all of this and then hit ‘q’ to continue.

Finally, if you are not using PulseAudio for anything other than Bluetooth audio, remove it from the image by entering

sudo apt-get -y purge pulseaudio*

The post Raspbian Stretch has arrived for Raspberry Pi appeared first on Raspberry Pi.

Community Profile: David Pride

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/community-profile-david-pride/

This column is from The MagPi issue 55. You can download a PDF of the full issue for free, or subscribe to receive the print edition in your mailbox or the digital edition on your tablet. All proceeds from the print and digital editions help the Raspberry Pi Foundation achieve its charitable goals.

David Pride’s experiences in computer education came slightly later in life. He admits to not being a grade-A student: he left school with few qualifications, unable to pursue further education at university. There was, however, a teacher who instilled in him a passion for computers and coding which would stick with him indefinitely.

David Pride The MagPi Raspberry Pi Community Profile

David joined us at the St James’s Palace community celebration, mingling with the likes of the Duke of York, plus organisers of Jams and clubs, such as Grace and Femi

Welcome to the Community

Twenty years later, back in 2012, David heard of the Raspberry Pi – a soon-to-be-released “new little marvel” that he instantly fell for, head first. Despite a lack of knowledge in Linux and Python, he experimented and had fun. He found a Raspberry Jam and, with it, Pi enthusiasts like Mike Horne and Peter Onion. The projects on display at the Jam were enough to push David further into the Raspberry Pi rabbit hole and, after working his way through several Python books, he began to take steps into the world of formal higher education.

David Pride The MagPi Raspberry Pi Community Profile

David’s determination to access and complete further education in computing has earned him a three-year PhD studentship. Not bad for a “lousy student”

Back to School

With a MOOC qualification from Rice University under his belt, he continued to improve upon his self-taught knowledge, and was fortunate enough to be accepted to study for a master’s degree in Computer Science at the University of Hertfordshire. With a distinction for his final dissertation, David completed the course with an overall distinction for his MSc, and was recently awarded a fully funded PhD studentship with The Open University’s Knowledge Media Institute.

David Pride The MagPi Raspberry Pi Community Profile

Self-playing xylophones, Wiimote air drums, Lego sorters, Pi Wars robots, and more. David is continually hacking toys, giving them new Pi-powered life

Maker of things

The portfolio of projects that helped him to achieve his many educational successes has provided regular retweet material for the Raspberry Pi Twitter account, and we’ve highlighted his fun, imaginative work on this blog before. His builds have travelled to a range of Jams and made their way to the Raspberry Pi and Code Club stands at the Bett Show, as well as to our birthday celebrations.

David Pride The MagPi Raspberry Pi Community Profile

“Pi & Chips – with a little extra source”

His website, the pun-tastic Pi and Chips, is home to the majority of his work; David also links to YouTube videos and walk-throughs of his projects, and relates his experiences at various events. If you’ve followed any of the action across the Raspberry Pi social media channels – or indeed read any previous issues of The MagPi magazine – you’ll no doubt have seen a couple of David’s projects.

David Pride The MagPi Raspberry Pi Community Profile 4-Bot

Many readers will have come across the wonderful 4-Bot before, and it has even made an appearance alongside David in a recent Bloomberg interview. Considering the trillions of possible game positions, David made a compromise and, if you’re lucky, you may just be able to beat it

The 4-Bot, a robotic second player for the family game Connect Four, allows people to go head to head with a Pi-powered robotic arm. Using a Python imaging library, the 4-Bot splits the game grid into 42 squares, and recognises them as being red, yellow, or empty by reading the RGB value of the space. Using the minimax algorithm, 4-Bot is able to play each move within 25 seconds. Believe us when we say that it’s not as easy to beat as you’d hope. Then there’s his more recent air drum kit, which uses an old toy found at a car boot sale together with a Wiimote to make a functional air drum that showcases David’s toy-hacking abilities… and his complete lack of rhythm. He does fare much better on his homemade laser harp, though!
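For the curious, the colour-recognition step could look something like the sketch below, which uses Pillow to average the RGB values inside each grid square and bucket it as red, yellow, or empty. The file name, cell size, and thresholds are invented for illustration; David’s real code (and the minimax move selection, which is the meatier part) lives on Pi and Chips.

from PIL import Image, ImageStat

def classify_cell(board_image, box):
    # Average the RGB values inside one grid square and guess its state.
    # `box` is a (left, top, right, bottom) crop rectangle in pixels.
    r, g, b = ImageStat.Stat(board_image.crop(box).convert("RGB")).mean[:3]
    # Made-up thresholds: red counters are strong in red only, yellow counters
    # are strong in both red and green, anything else counts as empty.
    if r > 150 and g < 100:
        return "red"
    if r > 150 and g > 150 and b < 100:
        return "yellow"
    return "empty"

frame = Image.open("board.jpg")  # hypothetical camera capture of the grid
cell = 60                        # hypothetical cell size in pixels
grid = [[classify_cell(frame, (col * cell, row * cell, (col + 1) * cell, (row + 1) * cell))
         for col in range(7)]    # Connect Four is 7 columns...
        for row in range(6)]     # ...by 6 rows, i.e. the 42 squares mentioned above
for row in grid:
    print(row)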

The post Community Profile: David Pride appeared first on Raspberry Pi.

We Are Not Having a Productive Debate About Women in Tech

Post Syndicated from Bozho original https://techblog.bozho.net/not-productive-debate-women-tech/

Yes, it’s about the “anti-diversity memo”. But I won’t go into the particular details of the memo, the firing, who’s right and wrong, who’s liberal and who’s conservative. Actually, I don’t need to repeat this post, which states almost exactly what I think about the particular issue. Just in case, and before someone decides to label me as a “sexist white male” who knows nothing, I guess I should clearly state that I acknowledge that biases against women are real, that I strongly support equal opportunity, and that I think there must be more women in technology. I also have to state that I think the author of “the memo” was well-meaning, had some well-argued, research-backed points, and should not be ostracized.

But I want to “rant” about the quality of the debate. On one side we have conservatives who are throwing themselves into defending the fired googler, insisting that liberals are banning conservative points of view, that it is normal to have so few women in tech and that everything is actually okay, or even that women are inferior. On the other side we have triggered liberals who are ready to shout “discrimination” and “harassment” at anything that resembles an attempt to claim anything other than total and absolute equality, in many cases using a classic “strawman” argument (e.g. “he’s saying women should not work in tech, he’s obviously wrong”).

Everyone seems to be too eager to take sides and issue a verdict on who’s right and who’s wrong, to blame the other side for all related and unrelated woes and, while doing that, to exhibit a huge number of biases. If the debate is about that, we’d better shut it down as soon as possible, because it’s not going to lead anywhere – no matter how much conservatives want “a debate”, and no matter how much liberals want to advance equality. Oh, and by the way – this “conservatives” vs “liberals” framing is a false dichotomy. Most people hold a somewhat sensible stance in between. But let’s get to the actual issue:

Women are underrepresented in STEM (science, technology, engineering, mathematics). That is a fact everyone agrees on, and one that is blatantly obvious when you walk into any software company’s office.

Why is that the case? The whole debate revolved around biological and social differences, some of which are probably even true – that women value job flexibility more than being promoted or getting a higher salary, that they are more neurotic (on average), that they are less confident, that they are more empathic, and so on. These differences have been studied and documented, and as much as I have my reservations about psychology studies (so much so that even meta-analyses are shown by meta-meta-analyses to be flawed) and social science in general, there seems to be a consensus there (by the way, it’s a shame that Gizmodo removed all the scientific references when they first published “the memo”). But that is not the issue. As has been pointed out, “inherent” male and female traits are equally applicable to working with technology.

Why are we talking about “technology”, and why not “mining and construction”, as many will point out? Let’s cut that argument once and for all – mining and construction are blue-collar jobs that have a high chance of being automated in the near future and are in decline. The problem that we’re trying to solve is how to make the dominant profession of the future – information technology – one of equal opportunity. Yes, it’s a bold claim, but software is going to be everywhere and the industry will grow. This is why it’s so important to discuss it, not because we are developers and are somewhat affected by it.

So, there has been extended research on the matter, and the reasons are – surprise – complex and intertwined and there is no simple issue that, once resolved, will unlock the path of women to tech jobs.

What would diversity give us, and why should we care? Let’s assume for a moment that we don’t care about equal opportunity and that we are right-leaning, conservative people. Well, imagine you have a growing business and you need to hire developers. What would you prefer – having fewer or more people to choose from? Having fewer or more diverse skills (technical and social) on the job market? The answer is obvious. The more people there are on the job market – regardless of their gender, race, whatever – the better for businesses.

So I guess we’ve agreed on the two points so far – that women are underrepresented, and that it’s better for everyone if there are more people with technical skills on the job market, which includes more women.

The “final” question is – how?

And this question seems to be missing from the discussion. Instead, we are going in circles with irrelevant arguments, trying to show either that we’ve read more scientific papers than others, that we are more liberal than others, or that we are more pro free speech.

Back to “how” – in Bulgaria we have a social meme: “I don’t know what the right way is, but the way you are doing it is NOT the right way”. And much of the underlying sentiment of “the memo” is similar – that Google should stop doing some of the stuff it is doing about diversity, or do it differently (but it doesn’t tell us how exactly). Hiring biases, internal programs, whatever, seem to bother him. But this is just talking about the surface of the problem. These programs are correcting something that remains hidden in “the memo”.

Google, on their diversity page, say that 20% of their tech employees are women. At the same time, in another diversity section, they claim that “18% of CS graduates are women”. So, I guess, job done – they’ve reached the maximum possible diversity: they’ve hired women in tech in roughly the same proportion as there are women CS graduates. Anything more than that, even if it doesn’t mean they’ll hire worse developers, will leave the rest of the industry with fewer women. So, sure, 50/50 at Google would sound cool, but the industry average would still be bad.

And that’s the actual, underlying reason we should have already arrived at, and from which we should have started discussing the “how”. Girls do not see STEM as a thing for them. Our biases are projected onto younger girls and culminate in a “this is not for girls” mantra. No matter how diverse our hiring policies are, if we don’t address the issue at a much earlier stage, we aren’t getting anywhere.

In schools and even kindergartens we need to have an inclusive environment where “this is not for girls” is frowned upon. We should not discourage girls from liking math, or make math sound uncool and “hard for girls” (in my biased world I actually know more women mathematicians than men). This comic seems to be on a different topic (gender-specific toys), but it’s actually not about toys – it’s about what is considered (stereo)typical for a girl to do. And most of these biases are unconscious, coming from all around us (school, TV, outdoor ads, people on the street, relatives, etc.), and it takes effort to confront them.

To do that, we need policy decisions. We need to lobby education departments and ministries to encourage girls more in the STEM direction (and don’t worry, they’ll be good at it). By the way, guess what – Google’s diversity program is not just about hiring more women; it actually includes education policies with stuff like “influencing perception about computer science”, “getting more girls to code”, and scholarships.

Let’s discuss the education policies, the path to getting 40-50% of CS graduates to be female, and before that – more girls in schools with a technical focus – and ultimately, how to get society to stop perceiving technology and science as “not for girls”. Let each girl decide on her own. All the other debates are short-sighted and beside the point. Will biological differences matter then? They probably will – but not enough to justify a high gender imbalance.

I am no expert in education policies and I don’t know what will work and what won’t. There is research on the matter that we should look at, and maybe argue about it. Everything else is wasted keystrokes.

The post We Are Not Having a Productive Debate About Women in Tech appeared first on Bozho's tech blog.

Growing up alongside tech

Post Syndicated from Eevee original https://eev.ee/blog/2017/08/09/growing-up-alongside-tech/

IndustrialRobot asks… or, uh, asked last month:

industrialrobot: How has your views on tech changed as you’ve got older?

This is so open-ended that it’s actually stumped me for a solid month. I’ve had a surprisingly hard time figuring out where to even start.


It’s not that my views of tech have changed too much — it’s that they’ve changed very gradually. Teasing out and explaining any one particular change is tricky when it happened invisibly over the course of 10+ years.

I think a better framework for this is to consider how my relationship to tech has changed. It’s gone through three pretty distinct phases, each of which has strongly colored how I feel and talk about technology.

Act I

In which I start from nothing.

Nothing is an interesting starting point. You only really get to start there once.

Learning something on my own as a kid was something of a magical experience, in a way that I don’t think I could replicate as an adult. I liked computers; I liked toying with computers; so I did that.

I don’t know how universal this is, but when I was a kid, I couldn’t even conceive of how incredible things were made. Buildings? Cars? Paintings? Operating systems? Where does any of that come from? Obviously someone made them, but it’s not the sort of philosophical point I lingered on when I was 10, so in the back of my head they basically just appeared fully-formed from the æther.

That meant that when I started trying out programming, I had no aspirations. I couldn’t imagine how far I would go, because all the examples of how far I would go were completely disconnected from any idea of human achievement. I started out with BASIC on a toy computer; how could I possibly envision a connection between that and something like a mainstream video game? Every new thing felt like a new form of magic, so I couldn’t conceive that I was even in the same ballpark as whatever process produced real software. (Even seeing the source code for GORILLAS.BAS, it didn’t quite click. I didn’t think to try reading any of it until years after I’d first encountered the game.)

This isn’t to say I didn’t have goals. I invented goals constantly, as I’ve always done; as soon as I learned about a new thing, I’d imagine some ways to use it, then try to build them. I produced a lot of little weird goofy toys, some of which entertained my tiny friend group for a couple days, some of which never saw the light of day. But none of it felt like steps along the way to some mountain peak of mastery, because I didn’t realize the mountain peak was even a place that could be gone to. It was pure, unadulterated (!) playing.

I contrast this to my art career, which started only a couple years ago. I was already in my late 20s, so I’d already spent decades seeing a very broad spectrum of art: everything from quick sketches up to painted masterpieces. And I’d seen the people who create that art, sometimes seen them create it in real-time. I’m even in a relationship with one of them! And of course I’d already had the experience of advancing through tech stuff and discovering first-hand that even the most amazing software is still just code someone wrote.

So from the very beginning, from the moment I touched pencil to paper, I knew the possibilities. I knew that the goddamn Sistine Chapel was something I could learn to do, if I were willing to put enough time in — and I knew that I’m not, so I’d have to settle somewhere a ways before that. I knew that I’d have to put an awful lot of work in before I’d be producing anything very impressive.

I did it anyway (though perhaps waited longer than necessary to start), but those aren’t things I can un-know, and so I can never truly explore art from a place of pure ignorance. On the other hand, I’ve probably learned to draw much more quickly and efficiently than if I’d done it as a kid, precisely because I know those things. Now I can decide I want to do something far beyond my current abilities, then go figure out how to do it. When I was just playing, that kind of ambition was impossible.


So, I played.

How did this affect my views on tech? Well, I didn’t… have any. Learning by playing tends to teach you things in an outward sprawl without many abrupt jumps to new areas, so you don’t tend to run up against conflicting information. The whole point of opinions is that they’re your own resolution to a conflict; without conflict, I can’t meaningfully say I had any opinions. I just accepted whatever I encountered at face value, because I didn’t even know enough to suspect there could be alternatives yet.

Act II

That started to seriously change around, I suppose, the end of high school and beginning of college. I was becoming aware of this whole “open source” concept. I took classes that used languages I wouldn’t otherwise have given a second thought. (One of them was Python!) I started to contribute to other people’s projects. Eventually I even got a job, where I had to work with other people. It probably also helped that I’d had to maintain my own old code a few times.

Now I was faced with conflicting subjective ideas, and I had to form opinions about them! And so I did. With gusto. Over time, I developed an idea of what was Right based on experience I’d accrued. And then I set out to always do things Right.

That’s served me decently well with some individual problems, but it also led me to inflict a lot of unnecessary pain on myself. Several endeavors languished for no other reason than my dissatisfaction with the architecture, long before the basic functionality was done. I started a number of “pure” projects around this time, generic tools like imaging libraries that I had no direct need for. I built them for the sake of them, I guess because I felt like I was improving some niche… but of course I never finished any. It was always in areas I didn’t know that well in the first place, which is a fine way to learn if you have a specific concrete goal in mind — but it turns out that building a generic library for editing images means you have to know everything about images. Perhaps that ambition went a little haywire.

I’ve said before that this sort of (self-inflicted!) work was unfulfilling, in part because the best outcome would be that a few distant programmers’ lives are slightly easier. I do still think that, but I think there’s a deeper point here too.

In forgetting how to play, I’d stopped putting any of myself in most of the work I was doing. Yes, building an imaging library is kind of a slog that someone has to do, but… I assume the people who work on software like PIL and ImageMagick are actually interested in it. The few domains I tried to enter and revolutionize weren’t passions of mine; I just happened to walk through the neighborhood one day and decided I could obviously do it better.

Not coincidentally, this was the same era of my life that led me to write stuff like that PHP post, which you may notice I am conspicuously not even linking to. I don’t think I would write anything like it nowadays. I could see myself approaching the same subject, but purely from the point of view of language design, with more contrasts and tradeoffs and less going for volume. I certainly wouldn’t lead off with inflammatory puffery like “PHP is a community of amateurs”.

Act III

I think I’ve mellowed out a good bit in the last few years.

It turns out that being Right is much less important than being Not Wrong — i.e., rather than trying to make something perfect that can be adapted to any future case, just avoid as many pitfalls as possible. Code that does something useful has much more practical value than unfinished code with some pristine architecture.

Nowhere is this more apparent than in game development, where all code is doomed to be crap and the best you can hope for is to stem the tide. But there’s also a fixed goal that’s completely unrelated to how the code looks: does the game work, and is it fun to play? Yes? Ship the damn thing and forget about it.

Games are also nice because it’s very easy to pour my own feelings into them and evoke feelings in the people who play them. They’re mine, something with my fingerprints on them — even the games I’ve built with glip have plenty of my own hallmarks, little touches I added on a whim or attention to specific details that I care about.

Maybe a better example is the Doom map parser I started writing. It sounds like a “pure” problem again, except that I actually know an awful lot about the subject already! I also cleverly (accidentally) released some useful results of the work I’ve done thusfar — like statistics about Doom II maps and a few screenshots of flipped stock maps — even though I don’t think the parser itself is far enough along to release yet. The tool has served a purpose, one with my fingerprints on it, even without being released publicly. That keeps it fresh in my mind as something interesting I’d like to keep working on, eventually. (When I run into an architecture question, I step back for a while, or I do other work in the hopes that the solution will reveal itself.)

I also made two simple Pokémon ROM hacks this year, despite knowing nothing about Game Boy internals or assembly when I started. I just decided I wanted to do an open-ended thing beyond my reach, and I went to do it, not worrying about cleanliness and willing to accept a bumpy ride to get there. I played, but in a more experienced way, invoking the stuff I know (and the people I’ve met!) to help me get a running start in completely unfamiliar territory.


This feels like a really fine distinction that I’m not sure I’m doing justice. I don’t know if I could’ve appreciated it three or four years ago. But I missed making toys, and I’m glad I’m doing it again.

In short, I forgot how to have fun with programming for a little while, and I’ve finally started to figure it out again. And that’s far more important than whether you use PHP or not.

Disney Ditching Netflix Keeps Piracy Relevant

Post Syndicated from Ernesto original https://torrentfreak.com/disney-ditching-netflix-keeps-piracy-relevant-170809/

There is little doubt that, in the United States, Netflix has become the standard for watching movies on the Internet.

The subscription service is responsible for a third of all Internet traffic during peak hours, dwarfing that of online piracy and other legal video platforms.

It’s safe to assume that Netflix-type streaming services are among the best and most convenient alternatives to piracy at this point. There is a problem though. The whole appeal of the streaming model becomes diluted when there are too many ‘Netflixes.’

Yesterday, Disney announced that it will end its partnership with Netflix in 2019. The company is working on its own Disney-branded movie streaming platforms, where titles such as Frozen 2 and Toy Story 4 will end up in the future.

Disney titles are among the most-watched content on Netflix, and the company’s stock took a hit when the news came out. In a statement late yesterday, Disney CEO Bob Iger noted that the company has a good relationship with Netflix, but the companies will part ways at the end of next year.

At the moment, no decision has been made on what happens to Lucasfilm and Marvel films, but these could find a new home as well. Marvel TV shows such as Jessica Jones and Luke Cage will reportedly stay at Netflix.

Although Disney’s decision may be good for Disney, a lot of Netflix users are not going to be happy. It likely means that they need another streaming platform subscription to get what they want, which isn’t a very positive prospect.

In piracy discussions, Hollywood insiders often stress that people have no reason to pirate, as pretty much all titles are available online legally. What they don’t mention, however, is that users need subscriptions to a few dozen paid services to access them all.

In a way, this fragmentation is keeping the pirate ecosystems intact. While legal streaming services work just fine, having dozens of subscriptions is expensive, and not very practical. Especially not compared to pirate streaming sites, where everything can be accessed on the same site.

The music business has a better model, or had initially. Services such as Spotify allowed fans to access most popular music in one place, although that’s starting to crumble as well, due to exclusive deals and more fragmentation.

Admittedly, for a no-name observer, it’s easy to criticize and point fingers. The TV and movie business is built on complicated licensing deals, where a single Netflix may not be able to generate enough revenue for an entire industry.

But there has to be a better way than simply adding more streaming platforms, one would think?

Instead of solely trying to stamp down on pirate sites, it might be a good idea to take a careful look at the supply side as well. At the moment, fragmentation is keeping pirate sites relevant.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.